datasetId | card |
|---|---|
um-ids/dailymed-annotations | ---
license: cc-by-4.0
---
1. DailyMed prescription indications
|
open-llm-leaderboard/details_synapsoft__Llama-2-7b-hf-flan2022-1.2M | ---
pretty_name: Evaluation run of synapsoft/Llama-2-7b-hf-flan2022-1.2M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [synapsoft/Llama-2-7b-hf-flan2022-1.2M](https://huggingface.co/synapsoft/Llama-2-7b-hf-flan2022-1.2M)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_synapsoft__Llama-2-7b-hf-flan2022-1.2M\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-13T06:59:34.296378](https://huggingface.co/datasets/open-llm-leaderboard/details_synapsoft__Llama-2-7b-hf-flan2022-1.2M/blob/main/results_2023-10-13T06-59-34.296378.json) (note
\ that there might be results for other tasks in the repo if successive evals didn't
\ cover the same tasks. You can find each in the results and the \"latest\" split for
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.25870385906040266,\n\
\ \"em_stderr\": 0.004484736946763185,\n \"f1\": 0.2965908137583894,\n\
\ \"f1_stderr\": 0.004480084563201026,\n \"acc\": 0.40002920104621126,\n\
\ \"acc_stderr\": 0.008888005892783395\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.25870385906040266,\n \"em_stderr\": 0.004484736946763185,\n\
\ \"f1\": 0.2965908137583894,\n \"f1_stderr\": 0.004480084563201026\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04473085670962851,\n \
\ \"acc_stderr\": 0.005693886131407052\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.755327545382794,\n \"acc_stderr\": 0.012082125654159738\n\
\ }\n}\n```"
repo_url: https://huggingface.co/synapsoft/Llama-2-7b-hf-flan2022-1.2M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|arc:challenge|25_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_13T06_59_34.296378
path:
- '**/details_harness|drop|3_2023-10-13T06-59-34.296378.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-13T06-59-34.296378.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_13T06_59_34.296378
path:
- '**/details_harness|gsm8k|5_2023-10-13T06-59-34.296378.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-13T06-59-34.296378.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hellaswag|10_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_13T06_59_34.296378
path:
- '**/details_harness|winogrande|5_2023-10-13T06-59-34.296378.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-13T06-59-34.296378.parquet'
- config_name: results
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- results_2023-08-29T11:38:40.621041.parquet
- split: 2023_10_13T06_59_34.296378
path:
- results_2023-10-13T06-59-34.296378.parquet
- split: latest
path:
- results_2023-10-13T06-59-34.296378.parquet
---
# Dataset Card for Evaluation run of synapsoft/Llama-2-7b-hf-flan2022-1.2M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/synapsoft/Llama-2-7b-hf-flan2022-1.2M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [synapsoft/Llama-2-7b-hf-flan2022-1.2M](https://huggingface.co/synapsoft/Llama-2-7b-hf-flan2022-1.2M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_synapsoft__Llama-2-7b-hf-flan2022-1.2M",
"harness_winogrande_5",
split="train")
```
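Each evaluated task corresponds to one configuration name. If you are unsure which configuration names are available, a minimal sketch (not part of the original card) using the standard `datasets` helper `get_dataset_config_names` lists them all:
```python
from datasets import get_dataset_config_names

# List every configuration in this repository
# (one per evaluated task, plus the aggregated "results" config).
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_synapsoft__Llama-2-7b-hf-flan2022-1.2M"
)
print(configs)
```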
## Latest results
These are the [latest results from run 2023-10-13T06:59:34.296378](https://huggingface.co/datasets/open-llm-leaderboard/details_synapsoft__Llama-2-7b-hf-flan2022-1.2M/blob/main/results_2023-10-13T06-59-34.296378.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.25870385906040266,
"em_stderr": 0.004484736946763185,
"f1": 0.2965908137583894,
"f1_stderr": 0.004480084563201026,
"acc": 0.40002920104621126,
"acc_stderr": 0.008888005892783395
},
"harness|drop|3": {
"em": 0.25870385906040266,
"em_stderr": 0.004484736946763185,
"f1": 0.2965908137583894,
"f1_stderr": 0.004480084563201026
},
"harness|gsm8k|5": {
"acc": 0.04473085670962851,
"acc_stderr": 0.005693886131407052
},
"harness|winogrande|5": {
"acc": 0.755327545382794,
"acc_stderr": 0.012082125654159738
}
}
```
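Since these aggregated metrics are stored in the "results" configuration, a minimal sketch (assuming the same `load_dataset` API as above) of fetching the latest aggregated results programmatically:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# its "latest" split points to the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_synapsoft__Llama-2-7b-hf-flan2022-1.2M",
    "results",
    split="latest",
)
print(results[0])
```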
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Qwen__Qwen2-beta-72B | ---
pretty_name: Evaluation run of Qwen/Qwen2-beta-72B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Qwen/Qwen2-beta-72B](https://huggingface.co/Qwen/Qwen2-beta-72B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Qwen__Qwen2-beta-72B_private\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-29T23:47:04.571636](https://huggingface.co/datasets/open-llm-leaderboard/details_Qwen__Qwen2-beta-72B_private/blob/main/results_2024-01-29T23-47-04.571636.json) (note
\ that there might be results for other tasks in the repo if successive evals didn't
\ cover the same tasks. You can find each in the results and the \"latest\" split for
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7669743429877653,\n\
\ \"acc_stderr\": 0.027971495069922473,\n \"acc_norm\": 0.7715834368806984,\n\
\ \"acc_norm_stderr\": 0.028493498109494097,\n \"mc1\": 0.412484700122399,\n\
\ \"mc1_stderr\": 0.017233299399571227,\n \"mc2\": 0.596080564321232,\n\
\ \"mc2_stderr\": 0.01451800985281567\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6262798634812287,\n \"acc_stderr\": 0.014137708601759095,\n\
\ \"acc_norm\": 0.658703071672355,\n \"acc_norm_stderr\": 0.01385583128749773\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6666998605855408,\n\
\ \"acc_stderr\": 0.004704293898729911,\n \"acc_norm\": 0.8598884684325832,\n\
\ \"acc_norm_stderr\": 0.003463933286063887\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7185185185185186,\n\
\ \"acc_stderr\": 0.038850042458002526,\n \"acc_norm\": 0.7185185185185186,\n\
\ \"acc_norm_stderr\": 0.038850042458002526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8881578947368421,\n \"acc_stderr\": 0.02564834125169361,\n\
\ \"acc_norm\": 0.8881578947368421,\n \"acc_norm_stderr\": 0.02564834125169361\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8226415094339623,\n \"acc_stderr\": 0.023508739218846938,\n\
\ \"acc_norm\": 0.8226415094339623,\n \"acc_norm_stderr\": 0.023508739218846938\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9027777777777778,\n\
\ \"acc_stderr\": 0.024774516250440175,\n \"acc_norm\": 0.9027777777777778,\n\
\ \"acc_norm_stderr\": 0.024774516250440175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7630057803468208,\n\
\ \"acc_stderr\": 0.032424147574830975,\n \"acc_norm\": 0.7630057803468208,\n\
\ \"acc_norm_stderr\": 0.032424147574830975\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.049512182523962604,\n\
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.049512182523962604\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n\
\ \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.8085106382978723,\n \"acc_stderr\": 0.025722149992637798,\n\
\ \"acc_norm\": 0.8085106382978723,\n \"acc_norm_stderr\": 0.025722149992637798\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n\
\ \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.5877192982456141,\n\
\ \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7862068965517242,\n \"acc_stderr\": 0.03416520447747549,\n\
\ \"acc_norm\": 0.7862068965517242,\n \"acc_norm_stderr\": 0.03416520447747549\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6984126984126984,\n \"acc_stderr\": 0.0236369759961018,\n \"acc_norm\"\
: 0.6984126984126984,\n \"acc_norm_stderr\": 0.0236369759961018\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5793650793650794,\n\
\ \"acc_stderr\": 0.04415438226743745,\n \"acc_norm\": 0.5793650793650794,\n\
\ \"acc_norm_stderr\": 0.04415438226743745\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8838709677419355,\n\
\ \"acc_stderr\": 0.018225757949432306,\n \"acc_norm\": 0.8838709677419355,\n\
\ \"acc_norm_stderr\": 0.018225757949432306\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6600985221674877,\n \"acc_stderr\": 0.033327690684107895,\n\
\ \"acc_norm\": 0.6600985221674877,\n \"acc_norm_stderr\": 0.033327690684107895\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774707,\n \"acc_norm\"\
: 0.84,\n \"acc_norm_stderr\": 0.03684529491774707\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066573,\n\
\ \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066573\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9292929292929293,\n \"acc_stderr\": 0.0182631054201995,\n \"acc_norm\"\
: 0.9292929292929293,\n \"acc_norm_stderr\": 0.0182631054201995\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9896373056994818,\n \"acc_stderr\": 0.007308424386792194,\n\
\ \"acc_norm\": 0.9896373056994818,\n \"acc_norm_stderr\": 0.007308424386792194\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8153846153846154,\n \"acc_stderr\": 0.01967163241310029,\n \
\ \"acc_norm\": 0.8153846153846154,\n \"acc_norm_stderr\": 0.01967163241310029\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4925925925925926,\n \"acc_stderr\": 0.030482192395191506,\n \
\ \"acc_norm\": 0.4925925925925926,\n \"acc_norm_stderr\": 0.030482192395191506\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8361344537815126,\n \"acc_stderr\": 0.024044054940440488,\n\
\ \"acc_norm\": 0.8361344537815126,\n \"acc_norm_stderr\": 0.024044054940440488\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5761589403973509,\n \"acc_stderr\": 0.04034846678603396,\n \"\
acc_norm\": 0.5761589403973509,\n \"acc_norm_stderr\": 0.04034846678603396\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9302752293577982,\n \"acc_stderr\": 0.01091942641184862,\n \"\
acc_norm\": 0.9302752293577982,\n \"acc_norm_stderr\": 0.01091942641184862\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6851851851851852,\n \"acc_stderr\": 0.0316746870682898,\n \"acc_norm\"\
: 0.6851851851851852,\n \"acc_norm_stderr\": 0.0316746870682898\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9313725490196079,\n\
\ \"acc_stderr\": 0.017744453647073322,\n \"acc_norm\": 0.9313725490196079,\n\
\ \"acc_norm_stderr\": 0.017744453647073322\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640273,\n\
\ \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640273\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n\
\ \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n\
\ \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.02871877688934232,\n\
\ \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.02871877688934232\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9090909090909091,\n \"acc_stderr\": 0.026243194054073892,\n \"\
acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.026243194054073892\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n\
\ \"acc_stderr\": 0.03434300243630999,\n \"acc_norm\": 0.8518518518518519,\n\
\ \"acc_norm_stderr\": 0.03434300243630999\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8711656441717791,\n \"acc_stderr\": 0.02632138319878367,\n\
\ \"acc_norm\": 0.8711656441717791,\n \"acc_norm_stderr\": 0.02632138319878367\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6517857142857143,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.6517857142857143,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808628,\n\
\ \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808628\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n\
\ \"acc_stderr\": 0.015537514263253874,\n \"acc_norm\": 0.9401709401709402,\n\
\ \"acc_norm_stderr\": 0.015537514263253874\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263734,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263734\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9157088122605364,\n\
\ \"acc_stderr\": 0.009934966499513786,\n \"acc_norm\": 0.9157088122605364,\n\
\ \"acc_norm_stderr\": 0.009934966499513786\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8352601156069365,\n \"acc_stderr\": 0.019971040982442265,\n\
\ \"acc_norm\": 0.8352601156069365,\n \"acc_norm_stderr\": 0.019971040982442265\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6346368715083799,\n\
\ \"acc_stderr\": 0.016104833880142302,\n \"acc_norm\": 0.6346368715083799,\n\
\ \"acc_norm_stderr\": 0.016104833880142302\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8594771241830066,\n \"acc_stderr\": 0.01989943546353996,\n\
\ \"acc_norm\": 0.8594771241830066,\n \"acc_norm_stderr\": 0.01989943546353996\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8392282958199357,\n\
\ \"acc_stderr\": 0.020862388082391888,\n \"acc_norm\": 0.8392282958199357,\n\
\ \"acc_norm_stderr\": 0.020862388082391888\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8641975308641975,\n \"acc_stderr\": 0.0190615881815054,\n\
\ \"acc_norm\": 0.8641975308641975,\n \"acc_norm_stderr\": 0.0190615881815054\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6276595744680851,\n \"acc_stderr\": 0.028838921471251455,\n \
\ \"acc_norm\": 0.6276595744680851,\n \"acc_norm_stderr\": 0.028838921471251455\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6121251629726207,\n\
\ \"acc_stderr\": 0.012444998309675631,\n \"acc_norm\": 0.6121251629726207,\n\
\ \"acc_norm_stderr\": 0.012444998309675631\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8198529411764706,\n \"acc_stderr\": 0.02334516361654484,\n\
\ \"acc_norm\": 0.8198529411764706,\n \"acc_norm_stderr\": 0.02334516361654484\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8137254901960784,\n \"acc_stderr\": 0.01575052628436337,\n \
\ \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.01575052628436337\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8285714285714286,\n \"acc_stderr\": 0.024127463462650156,\n\
\ \"acc_norm\": 0.8285714285714286,\n \"acc_norm_stderr\": 0.024127463462650156\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\
\ \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n\
\ \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.94,\n \"acc_stderr\": 0.023868325657594194,\n \
\ \"acc_norm\": 0.94,\n \"acc_norm_stderr\": 0.023868325657594194\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.03851597683718533,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.03851597683718533\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072864,\n\
\ \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072864\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.412484700122399,\n\
\ \"mc1_stderr\": 0.017233299399571227,\n \"mc2\": 0.596080564321232,\n\
\ \"mc2_stderr\": 0.01451800985281567\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8303078137332282,\n \"acc_stderr\": 0.010549542647363696\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6573161485974223,\n \
\ \"acc_stderr\": 0.013073030230827912\n }\n}\n```"
repo_url: https://huggingface.co/Qwen/Qwen2-beta-72B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|arc:challenge|25_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|gsm8k|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hellaswag|10_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T23-47-04.571636.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-29T23-47-04.571636.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- '**/details_harness|winogrande|5_2024-01-29T23-47-04.571636.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-29T23-47-04.571636.parquet'
- config_name: results
data_files:
- split: 2024_01_29T23_47_04.571636
path:
- results_2024-01-29T23-47-04.571636.parquet
- split: latest
path:
- results_2024-01-29T23-47-04.571636.parquet
---
# Dataset Card for Evaluation run of Qwen/Qwen2-beta-72B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Qwen/Qwen2-beta-72B](https://huggingface.co/Qwen/Qwen2-beta-72B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Qwen__Qwen2-beta-72B_private",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-29T23:47:04.571636](https://huggingface.co/datasets/open-llm-leaderboard/details_Qwen__Qwen2-beta-72B_private/blob/main/results_2024-01-29T23-47-04.571636.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7669743429877653,
"acc_stderr": 0.027971495069922473,
"acc_norm": 0.7715834368806984,
"acc_norm_stderr": 0.028493498109494097,
"mc1": 0.412484700122399,
"mc1_stderr": 0.017233299399571227,
"mc2": 0.596080564321232,
"mc2_stderr": 0.01451800985281567
},
"harness|arc:challenge|25": {
"acc": 0.6262798634812287,
"acc_stderr": 0.014137708601759095,
"acc_norm": 0.658703071672355,
"acc_norm_stderr": 0.01385583128749773
},
"harness|hellaswag|10": {
"acc": 0.6666998605855408,
"acc_stderr": 0.004704293898729911,
"acc_norm": 0.8598884684325832,
"acc_norm_stderr": 0.003463933286063887
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7185185185185186,
"acc_stderr": 0.038850042458002526,
"acc_norm": 0.7185185185185186,
"acc_norm_stderr": 0.038850042458002526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8881578947368421,
"acc_stderr": 0.02564834125169361,
"acc_norm": 0.8881578947368421,
"acc_norm_stderr": 0.02564834125169361
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8226415094339623,
"acc_stderr": 0.023508739218846938,
"acc_norm": 0.8226415094339623,
"acc_norm_stderr": 0.023508739218846938
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9027777777777778,
"acc_stderr": 0.024774516250440175,
"acc_norm": 0.9027777777777778,
"acc_norm_stderr": 0.024774516250440175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7630057803468208,
"acc_stderr": 0.032424147574830975,
"acc_norm": 0.7630057803468208,
"acc_norm_stderr": 0.032424147574830975
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.049512182523962604,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.049512182523962604
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.8085106382978723,
"acc_stderr": 0.025722149992637798,
"acc_norm": 0.8085106382978723,
"acc_norm_stderr": 0.025722149992637798
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5877192982456141,
"acc_stderr": 0.046306532033665956,
"acc_norm": 0.5877192982456141,
"acc_norm_stderr": 0.046306532033665956
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7862068965517242,
"acc_stderr": 0.03416520447747549,
"acc_norm": 0.7862068965517242,
"acc_norm_stderr": 0.03416520447747549
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6984126984126984,
"acc_stderr": 0.0236369759961018,
"acc_norm": 0.6984126984126984,
"acc_norm_stderr": 0.0236369759961018
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5793650793650794,
"acc_stderr": 0.04415438226743745,
"acc_norm": 0.5793650793650794,
"acc_norm_stderr": 0.04415438226743745
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8838709677419355,
"acc_stderr": 0.018225757949432306,
"acc_norm": 0.8838709677419355,
"acc_norm_stderr": 0.018225757949432306
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6600985221674877,
"acc_stderr": 0.033327690684107895,
"acc_norm": 0.6600985221674877,
"acc_norm_stderr": 0.033327690684107895
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774707,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774707
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066573,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066573
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.0182631054201995,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.0182631054201995
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9896373056994818,
"acc_stderr": 0.007308424386792194,
"acc_norm": 0.9896373056994818,
"acc_norm_stderr": 0.007308424386792194
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8153846153846154,
"acc_stderr": 0.01967163241310029,
"acc_norm": 0.8153846153846154,
"acc_norm_stderr": 0.01967163241310029
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4925925925925926,
"acc_stderr": 0.030482192395191506,
"acc_norm": 0.4925925925925926,
"acc_norm_stderr": 0.030482192395191506
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8361344537815126,
"acc_stderr": 0.024044054940440488,
"acc_norm": 0.8361344537815126,
"acc_norm_stderr": 0.024044054940440488
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5761589403973509,
"acc_stderr": 0.04034846678603396,
"acc_norm": 0.5761589403973509,
"acc_norm_stderr": 0.04034846678603396
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9302752293577982,
"acc_stderr": 0.01091942641184862,
"acc_norm": 0.9302752293577982,
"acc_norm_stderr": 0.01091942641184862
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.0316746870682898,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.0316746870682898
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073322,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073322
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.019269323025640273,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.019269323025640273
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7937219730941704,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.7937219730941704,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.02871877688934232,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.02871877688934232
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.026243194054073892,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.026243194054073892
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243630999,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243630999
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8711656441717791,
"acc_stderr": 0.02632138319878367,
"acc_norm": 0.8711656441717791,
"acc_norm_stderr": 0.02632138319878367
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6517857142857143,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.6517857142857143,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.03288180278808628,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.03288180278808628
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253874,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253874
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263734,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263734
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9157088122605364,
"acc_stderr": 0.009934966499513786,
"acc_norm": 0.9157088122605364,
"acc_norm_stderr": 0.009934966499513786
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8352601156069365,
"acc_stderr": 0.019971040982442265,
"acc_norm": 0.8352601156069365,
"acc_norm_stderr": 0.019971040982442265
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6346368715083799,
"acc_stderr": 0.016104833880142302,
"acc_norm": 0.6346368715083799,
"acc_norm_stderr": 0.016104833880142302
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8594771241830066,
"acc_stderr": 0.01989943546353996,
"acc_norm": 0.8594771241830066,
"acc_norm_stderr": 0.01989943546353996
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8392282958199357,
"acc_stderr": 0.020862388082391888,
"acc_norm": 0.8392282958199357,
"acc_norm_stderr": 0.020862388082391888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8641975308641975,
"acc_stderr": 0.0190615881815054,
"acc_norm": 0.8641975308641975,
"acc_norm_stderr": 0.0190615881815054
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6276595744680851,
"acc_stderr": 0.028838921471251455,
"acc_norm": 0.6276595744680851,
"acc_norm_stderr": 0.028838921471251455
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6121251629726207,
"acc_stderr": 0.012444998309675631,
"acc_norm": 0.6121251629726207,
"acc_norm_stderr": 0.012444998309675631
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8198529411764706,
"acc_stderr": 0.02334516361654484,
"acc_norm": 0.8198529411764706,
"acc_norm_stderr": 0.02334516361654484
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.01575052628436337,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.01575052628436337
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8285714285714286,
"acc_stderr": 0.024127463462650156,
"acc_norm": 0.8285714285714286,
"acc_norm_stderr": 0.024127463462650156
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.94,
"acc_stderr": 0.023868325657594194,
"acc_norm": 0.94,
"acc_norm_stderr": 0.023868325657594194
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.03851597683718533,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.03851597683718533
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072864,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072864
},
"harness|truthfulqa:mc|0": {
"mc1": 0.412484700122399,
"mc1_stderr": 0.017233299399571227,
"mc2": 0.596080564321232,
"mc2_stderr": 0.01451800985281567
},
"harness|winogrande|5": {
"acc": 0.8303078137332282,
"acc_stderr": 0.010549542647363696
},
"harness|gsm8k|5": {
"acc": 0.6573161485974223,
"acc_stderr": 0.013073030230827912
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
zenml/rag_qa_embedding_questions | ---
dataset_info:
features:
- name: page_content
dtype: string
- name: filename
dtype: string
- name: parent_section
dtype: string
- name: url
dtype: string
- name: embedding
sequence: float64
- name: token_count
dtype: int64
- name: generated_questions
sequence: string
- name: __pydantic_initialised__
dtype: bool
splits:
- name: train
num_bytes: 10042718
num_examples: 1806
download_size: 5948133
dataset_size: 10042718
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
JonnyHsu/testtest | ---
license: openrail
---
|
Blessin/dialogues-one-liners | ---
license: mit
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_131 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 822836648.0
num_examples: 161594
download_size: 837715041
dataset_size: 822836648.0
---
# Dataset Card for "chunk_131"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
osmzrl/ofpptv2 | ---
license: apache-2.0
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_106 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 887841120.0
num_examples: 174360
download_size: 905816663
dataset_size: 887841120.0
---
# Dataset Card for "chunk_106"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
satellite-image-deep-learning/DOTAv2 | ---
license: cc-by-4.0
tags:
- remote-sensing
- oriented-bounding-boxes
- object-detection
---
DOTA v2 Dataset with OBB, specifically the version from the [Ultralytics docs](https://docs.ultralytics.com/datasets/obb/dota-v2/)
- [Website](https://captain-whu.github.io/DOTA/dataset.html)

## Full License
Reproduced here from the dataset website:
License for Academic Non-Commercial Use Only
This DOTA dataset is made available under the following terms:
1. The Google Earth images in this dataset are subject to Google Earth's terms of use, which must be adhered to.
2. The GF-2 and JL-1 satellite images are provided by the China Centre for Resources Satellite Data and Application. The aerial images are provided by CycloMedia B.V.
3. Permission is hereby granted, free of charge, to any person obtaining a copy of this dataset to use it for academic, research, and other non-commercial uses only.
4. Redistribution, modification, or commercial use of this dataset or any portion of it is strictly prohibited without explicit permission from the copyright holder.
5. Any academic work that makes use of this dataset should include a citation to the dataset source.
All rights not expressly granted are reserved.
|
indiejoseph/wikipedia-zh-yue-filtered | ---
license: cc-by-4.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 90299602
num_examples: 133133
download_size: 56260688
dataset_size: 90299602
---
|
jenhsia/ragged_id2title | ---
license: mit
dataset_info:
- config_name: id2title_demo
features:
- name: id
dtype: int64
- name: title
dtype: string
splits:
- name: train
num_bytes: 509
num_examples: 8
download_size: 1866
dataset_size: 509
- config_name: kilt_wikipedia_id2title
features:
- name: id
dtype: int64
- name: title
dtype: string
splits:
- name: train
num_bytes: 187669689
num_examples: 5903530
download_size: 147063565
dataset_size: 187669689
configs:
- config_name: id2title_demo
data_files:
- split: train
path: id2title_demo/train-*
- config_name: kilt_wikipedia_id2title
data_files:
- split: train
path: kilt_wikipedia_id2title/train-*
---
|
Atsushi/fungi_indexed_mycological_papers_japanese | ---
annotations_creators:
- other
language:
- ja
license:
- cc-by-4.0
multilinguality:
- monolingual
source_datasets:
- original
size_categories:
- 1K<n<10K
---
fungi_indexed_mycological_papers_japanese
Daikinrin "3-Line Paper Summaries" dataset
Last updated: 2024/2/23 (up to R3-11457)
====
### Languages
Japanese
This dataset is available in Japanese only.
# Overview
[Daikinrin](http://mycoscouter.coolblog.jp/daikinrin/), a website run personally by Atsushi Nakajima (中島淳志), provides summaries and indexing of several thousand mycological taxonomy papers in the form of "3-line paper summaries".
This dataset compiles, for each paper covered by the "3-line paper summaries", its 3-line abstract, tags (index terms), list of reported species, and list of compared species.
The "3-line paper summaries" are updated daily, but this dataset is planned to be updated roughly once a month.
A web app visualizing this dataset is also [published on Observable](https://tinyurl.com/2tvryz8u).
## Related datasets
"Diagnostic characters summary"
[Atsushi/fungi_diagnostic_chars_comparison_japanese](https://huggingface.co/datasets/Atsushi/fungi_diagnostic_chars_comparison_japanese)
"Trait Circus dataset" (controlled traits)
[Atsushi/fungi_trait_circus_database](https://huggingface.co/datasets/Atsushi/fungi_trait_circus_database)
## Column descriptions
* R3ID … ID of the Daikinrin "3-line paper summary".
* ja_title_provisional_translate (provisionally translated Japanese title) … Title translated by the dataset author; where a Japanese original title exists, it is used as-is.
* original_title
* published_year
* journal_title
* source … URL of the source literature for each entry.
* daikinrin_url … URL of the Daikinrin "3-line paper summary".
* tags … Index terms assigned by the author after reading each paper in full, separated by comma plus half-width space. They broadly cover morphological characters, hosts/substrates, laboratory equipment/methods/reagents, geographic distribution, physiology/biochemistry, and more.
* R3summary_1 … First line of the 3-line abstract.
* R3summary_2 … Second line of the 3-line abstract.
* R3summary_3 … Third line of the 3-line abstract.
* species_reported … List of species reported in the paper, separated by "half-width space + slash + half-width space" (" / "). The symbols mean the following:
* ★ = new species (including new subspecies, forms, and varieties)
* ■ = newly recorded species
* ▲ = new combination
* ◆ = new name
* ● = new rank
* (no symbol) = other
* species_compared … List of species that were compared in some way with any of the reported species in the paper, separated by " / ". For details, see the "Diagnostic characters summary" dataset ([Atsushi/fungi_diagnostic_chars_comparison_japanese](https://huggingface.co/datasets/Atsushi/fungi_diagnostic_chars_comparison_japanese)).
* taxon_reported … Higher taxa corresponding to the reported species, separated by comma plus half-width space. Assigned based on MycoBank information, which may not be the latest.
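For convenience, a minimal parsing sketch (not part of the original card; it assumes the dataset exposes a single `train` split and the column names and delimiters described above):
```python
from datasets import load_dataset

# Assumption: the dataset loads with a single "train" split.
ds = load_dataset("Atsushi/fungi_indexed_mycological_papers_japanese", split="train")

row = ds[0]
tags = row["tags"].split(", ")                  # comma + half-width space
species = row["species_reported"].split(" / ")  # space + slash + space
# Strip the status symbols (★ ■ ▲ ◆ ●) to recover bare species names.
names = [s.lstrip("★■▲◆●") for s in species]
```
|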
abhinand/tamil-llama-eval | ---
language:
- ta
license: gpl
size_categories:
- n<1K
task_categories:
- text-generation
pretty_name: tamil-llama-eval
dataset_info:
config_name: large
features:
- name: input
dtype: string
- name: raw_input
dtype: string
- name: evol_source
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 1077035
num_examples: 956
download_size: 347891
dataset_size: 1077035
configs:
- config_name: large
data_files:
- split: train
path: large/train-*
---
# Dataset Card for "tamil-alpaca-eval"
This repository includes evaluation instructions to quickly test the Tamil LLaMA family of instruction models. To dive deep into the development and capabilities of the models, please read the [research paper](https://arxiv.org/abs/2311.05845) and the [introductory blog post (WIP)]() that outlines our journey and the model's potential impact.
**GitHub Repository:** [https://github.com/abhinand5/tamil-llama](https://github.com/abhinand5/tamil-llama)
**Note:** This is the second version of the evaluation dataset, created using the [Evol Instruct](https://arxiv.org/pdf/2304.12244.pdf) methodology and GPT-4. The initial 120 questions in [Tamil-Llama-Eval.csv](https://huggingface.co/datasets/abhinand/tamil-llama-eval/blob/main/Tamil-LLaMA-Eval.csv) (v1) were used as seed instructions.
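To load the evaluation set, a minimal sketch (the `large` config and its `train` split come from this card's metadata above):
```python
from datasets import load_dataset

# "large" is the only config declared in this card's metadata.
eval_ds = load_dataset("abhinand/tamil-llama-eval", "large", split="train")
print(eval_ds[0]["category"])  # one of the task types in the table below
print(eval_ds[0]["input"])
```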
## Models evaluated using this dataset
| Task Type | [Tamil-LLaMA-7B](abhinand/tamil-llama-7b-instruct-v0.1) | [Tamil-LLaMA-13B](abhinand/tamil-llama-13b-instruct-v0.1) | [gpt-3.5-turbo](https://platform.openai.com/docs/models/gpt-3-5) |
|-----------------|----------------|-----------------|---------------|
| Question Answering | 77.00 | 75.33 | 54.33 |
| Open-ended QA | 84.47 | 85.26 | 58.68 |
| Reasoning | 47.50 | 64.25 | 63.50 |
| Literature | 45.50 | 40.00 | 71.00 |
| Entertainment | 43.33 | 50.00 | 60.00 |
| Creative Writing| 92.50 | 95.62 | 59.69 |
| Translation | 60.56 | 66.67 | 92.78 |
| Coding | 63.57 | 76.07 | 57.14 |
| Ethics | 23.75 | 57.50 | 40.00 |
| **Overall** | **63.83** | **71.17** | **61.33** |
## Meet the Developers
Get to know the creators behind this innovative model and follow their contributions to the field:
- [Abhinand Balachandran](https://www.linkedin.com/in/abhinand-05/)
## Citation
If you use this model or any of the Tamil-Llama datasets in your research, please cite:
```bibtex
@misc{balachandran2023tamilllama,
title={Tamil-Llama: A New Tamil Language Model Based on Llama 2},
author={Abhinand Balachandran},
year={2023},
eprint={2311.05845},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
joey234/mmlu-moral_disputes-original-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 29708
num_examples: 53
download_size: 22744
dataset_size: 29708
---
# Dataset Card for "mmlu-moral_disputes-original-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Violetmae14/image-to-sound-with-effects-using-videos | ---
license: artistic-2.0
---
image to sounds using facebook: images here?
space: button, images and audios, |
Gregor/mblip-train | ---
license: other
language:
- en
- multilingual
pretty_name: mBLIP instructions
---
# mBLIP Instruct Mix Dataset Card
## Important!
This dataset currently does not work directly with `datasets.load_dataset("Gregor/mblip-train")`!
Please download the data files you need and load them with `datasets.load_dataset("json", data_files="filename")`.
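For example, a minimal sketch for the combined mix file (`task_mix_mt.json`, described under "Dataset structure" below):
```python
from huggingface_hub import hf_hub_download
from datasets import load_dataset

# Download a single data file from the dataset repo, then load it as JSON.
path = hf_hub_download(
    repo_id="Gregor/mblip-train",
    filename="task_mix_mt.json",  # the combined instruction mix (see below)
    repo_type="dataset",
)
ds = load_dataset("json", data_files=path)
```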
## Dataset details
**Dataset type:**
This is the instruction mix used to train [mBLIP](https://github.com/gregor-ge/mBLIP).
See https://github.com/gregor-ge/mBLIP/data/README.md for more information on how to reproduce the data.
**Dataset date:**
The dataset was created in May 2023.
**Dataset languages:**
The original English examples were machine translated to the following 95 languages:
`
af, am, ar, az, be, bg, bn, ca, ceb, cs, cy, da, de, el, en, eo, es, et, eu, fa, fi, fil, fr, ga, gd, gl, gu, ha, hi, ht, hu, hy, id, ig, is, it, iw, ja, jv, ka, kk, km, kn, ko, ku, ky, lb, lo, lt, lv, mg, mi, mk, ml, mn, mr, ms, mt, my, ne, nl, no, ny, pa, pl, ps, pt, ro, ru, sd, si, sk, sl, sm, sn, so, sq, sr, st, su, sv, sw, ta, te, tg, th, tr, uk, ur, uz, vi, xh, yi, yo, zh, zu
`
Languages are translated in proportion to their size in [mC4](https://www.tensorflow.org/datasets/catalog/c4#c4multilingual), i.e., as 6% of examples in mC4 are German, we translate 6% of the data to German.
**Dataset structure:**
- `task_mix_mt.json`: The instruction mix data in the processed, translated, and combined form.
- Folders: The folders contain 1) the separate tasks used to generate the mix
and 2) the files of the tasks used to evaluate the model.
**Images:**
We do not include any images with this dataset.
Images from the public datasets (MSCOCO for instruction training, and others for evaluation) can be downloaded
from the respective websites.
For the BLIP captions, we provide the URLs and filenames as used by us [here](blip_captions/ccs_synthetic_filtered_large_2273005_raw.json).
To download them, [our code](https://github.com/gregor-ge/mBLIP/tree/main/data#blip-web-capfilt) can be adapted, for example.
**License:**
Use must comply with the licenses of the original datasets used to create this mix. See https://github.com/gregor-ge/mBLIP/data/README.md for more.
Translations were produced with [NLLB](https://huggingface.co/facebook/nllb-200-distilled-1.3B) so use has to comply with
their license.
**Where to send questions or comments about the model:**
https://github.com/gregor-ge/mBLIP/issues
## Intended use
**Primary intended uses:**
The primary intended use is research on large multilingual multimodal models and chatbots.
**Primary intended users:**
The primary intended users of the model are researchers and hobbyists in computer vision, natural language processing, machine learning, and artificial intelligence. |
divers/jobsedcription-requirement | ---
dataset_info:
features:
- name: index
dtype: int64
- name: job_description
dtype: string
- name: job_requirements
dtype: string
- name: unknown
dtype: float64
- name: __index_level_0__
dtype: float64
splits:
- name: train
num_bytes: 25599853
num_examples: 4551
download_size: 12633905
dataset_size: 25599853
---
# Dataset Card for "jobsedcription-requirement"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
realPCH/trutuful_ko | ---
dataset_info:
features:
- name: question
dtype: string
- name: mc1_targets
struct:
- name: choices
dtype: string
- name: labels
dtype: string
- name: mc2_targets
struct:
- name: choices
dtype: string
- name: labels
dtype: string
splits:
- name: validation
num_bytes: 2788
num_examples: 3
download_size: 9265
dataset_size: 2788
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
|
vwxyzjn/summarize_from_feedback_tldr_3_filtered_oai_preprocessing_1706381144 | ---
dataset_info:
features:
- name: id
dtype: string
- name: subreddit
dtype: string
- name: title
dtype: string
- name: post
dtype: string
- name: summary
dtype: string
- name: query_token
sequence: int64
- name: query
dtype: string
- name: reference_response
dtype: string
- name: reference_response_token
sequence: int64
- name: reference_response_token_len
dtype: int64
- name: query_reference_response
dtype: string
- name: query_reference_response_token
sequence: int64
- name: query_reference_response_token_response_label
sequence: int64
- name: query_reference_response_token_len
dtype: int64
splits:
- name: train
num_bytes: 2125689249
num_examples: 116722
- name: validation
num_bytes: 117437271
num_examples: 6447
- name: test
num_bytes: 119410966
num_examples: 6553
download_size: 562087836
dataset_size: 2362537486
---
# TL;DR SFT Dataset for OpenAI's [Summarize from Feedback](https://openai.com/blog/summarization/) task
The dataset is directly taken from https://github.com/openai/summarize-from-feedback/tree/700967448d10004279f138666442bf1497d0e705#reddit-tldr-dataset
These columns are taken directly from the aforementioned dataset:
* **id**: unique identifier for the post
* **subreddit**: subreddit the post was taken from
* **title**: title of the post
* **post**: body of the post
* **summary**: summary of the post
* **reference_response**: reference response for the post
These columns are added by this preprocessing script:
* **query**: length-limited query for summarization: OAI pre-processes the main text (title + subreddit + post), ensuring it has only 512 tokens; if the main text is too long, then it tries to truncate at the last `\n`. If it's too short it pads the main text ([summarize_from_feedback/tasks.py#L98-L165](https://github.com/openai/summarize-from-feedback/blob/700967448d10004279f138666442bf1497d0e705/summarize_from_feedback/tasks.py#L98-L165)). Padding is either space or `[PAD]` token (see Args below). A sketch of this truncation/padding logic appears after this list.
* **query_token**: tokenized version of `query`
* **reference_response_token**: tokenized version of `reference_response`
* **reference_response_token_len**: length of `reference_response_token`
* **query_reference_response**: concatenation of `query.strip()` and `reference_response`
* **query_reference_response_token**: tokenized version of `query_reference_response`, up to `max_sft_query_response_length` tokens
* **query_reference_response_token_len**: length of `query_reference_response_token`
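As a rough illustration of the **query** construction described above, here is a minimal sketch (hypothetical code, not OpenAI's implementation; it reuses the TL;DR format string and the left-side `[PAD]` padding shown in the Args below):
```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("EleutherAI/pythia-1b-deduped")

def build_query(subreddit: str, title: str, post: str,
                length: int = 512, pad_id: int = 50277) -> list[int]:
    def encode(p: str) -> list[int]:
        return tok.encode(f"SUBREDDIT: r/{subreddit}\n\nTITLE: {title}\n\nPOST: {p}\n\nTL;DR:")
    ids = encode(post)
    # Too long: repeatedly drop everything after the post's last newline.
    while len(ids) > length and "\n" in post:
        post = post[: post.rindex("\n")]
        ids = encode(post)
    ids = ids[:length]  # hard fallback cut
    # Too short: left-pad with the pad token, as in the Args below.
    return [pad_id] * (length - len(ids)) + ids

assert len(build_query("AskReddit", "Example", "line one\nline two")) == 512
```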
# Args
```python
{'base_model': 'EleutherAI/pythia-1b-deduped',
'check_length_correctness': True,
'cnndm_params': TaskQueryHParams(length=1919,
format_str='Article:\n{article}\n\nTL;DR:\n',
truncate_field='article',
truncate_text='\n',
padding='pad_token',
pad_token=[50277],
pad_side='left',
max_sft_response_length=None,
max_sft_query_response_length=None,
max_rm_response_length=155,
max_rm_query_response_length=2021),
'debug': False,
'hf_entity': 'vwxyzjn',
'push_to_hub': True,
'tldr_params': TaskQueryHParams(length=512,
format_str='SUBREDDIT: r/{subreddit}\n'
'\n'
'TITLE: {title}\n'
'\n'
'POST: {post}\n'
'\n'
'TL;DR:',
truncate_field='post',
truncate_text='\n',
padding='pad_token',
pad_token=[50277],
pad_side='left',
max_sft_response_length=53,
max_sft_query_response_length=562,
max_rm_response_length=169,
max_rm_query_response_length=638)}
```
|
gymprathap/Brain-MRI-LGG-Segmentation | ---
license: cc-by-4.0
language:
- en
tags:
- medical
size_categories:
- 1K<n<10K
---
This dataset contains brain MR images together with manually created FLAIR abnormality segmentation masks.
The images were obtained from The Cancer Imaging Archive (TCIA).
They correspond to 110 patients with lower-grade glioma from The Cancer Genome Atlas (TCGA) collection for whom fluid-attenuated inversion recovery (FLAIR) sequences and genomic cluster data are available.
Patient information and tumour genomic clusters can be found in the data.csv file.
<a href="http://projectcentersinchennai.co.in/Final-Year-Projects-for-CSE/Final-Year-Projects-for-CSE-Deep-learning-Domain" title="Deep Learning Projects for Final Year">Deep Learning Projects for Final Year</a>
Note: this is not my own dataset; it was obtained from Kaggle. |
open-llm-leaderboard/details_jefferylovely__SuperThetaMaven | ---
pretty_name: Evaluation run of jefferylovely/SuperThetaMaven
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jefferylovely/SuperThetaMaven](https://huggingface.co/jefferylovely/SuperThetaMaven)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jefferylovely__SuperThetaMaven\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T04:35:28.673518](https://huggingface.co/datasets/open-llm-leaderboard/details_jefferylovely__SuperThetaMaven/blob/main/results_2024-02-02T04-35-28.673518.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6545059772983542,\n\
\ \"acc_stderr\": 0.03205377283844669,\n \"acc_norm\": 0.6538200997878416,\n\
\ \"acc_norm_stderr\": 0.03272641534569135,\n \"mc1\": 0.5703794369645043,\n\
\ \"mc1_stderr\": 0.01732923458040909,\n \"mc2\": 0.7177387118634652,\n\
\ \"mc2_stderr\": 0.014774281827372924\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7107508532423208,\n \"acc_stderr\": 0.013250012579393441,\n\
\ \"acc_norm\": 0.7363481228668942,\n \"acc_norm_stderr\": 0.012875929151297042\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7144991037641903,\n\
\ \"acc_stderr\": 0.004507296196227809,\n \"acc_norm\": 0.8899621589324835,\n\
\ \"acc_norm_stderr\": 0.0031229736320394727\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406783,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406783\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886797,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886797\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.013547415658662264,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.013547415658662264\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42905027932960893,\n\
\ \"acc_stderr\": 0.016553287863116037,\n \"acc_norm\": 0.42905027932960893,\n\
\ \"acc_norm_stderr\": 0.016553287863116037\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.023788583551658533,\n\
\ \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.023788583551658533\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n\
\ \"acc_stderr\": 0.012747248967079064,\n \"acc_norm\": 0.470013037809648,\n\
\ \"acc_norm_stderr\": 0.012747248967079064\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5703794369645043,\n\
\ \"mc1_stderr\": 0.01732923458040909,\n \"mc2\": 0.7177387118634652,\n\
\ \"mc2_stderr\": 0.014774281827372924\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8492501973164956,\n \"acc_stderr\": 0.010056094631479674\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7012888551933283,\n \
\ \"acc_stderr\": 0.012607137125693633\n }\n}\n```"
repo_url: https://huggingface.co/jefferylovely/SuperThetaMaven
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|arc:challenge|25_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|gsm8k|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hellaswag|10_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T04-35-28.673518.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T04-35-28.673518.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- '**/details_harness|winogrande|5_2024-02-02T04-35-28.673518.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T04-35-28.673518.parquet'
- config_name: results
data_files:
- split: 2024_02_02T04_35_28.673518
path:
- results_2024-02-02T04-35-28.673518.parquet
- split: latest
path:
- results_2024-02-02T04-35-28.673518.parquet
---
# Dataset Card for Evaluation run of jefferylovely/SuperThetaMaven
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jefferylovely/SuperThetaMaven](https://huggingface.co/jefferylovely/SuperThetaMaven) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jefferylovely__SuperThetaMaven",
"harness_winogrande_5",
split="train")
```
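The aggregated results mentioned above live in the `results` configuration, which also has a `latest` split; assuming the same pattern, they can be loaded like so:
```python
results = load_dataset("open-llm-leaderboard/details_jefferylovely__SuperThetaMaven",
	"results",
	split="latest")
```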
## Latest results
These are the [latest results from run 2024-02-02T04:35:28.673518](https://huggingface.co/datasets/open-llm-leaderboard/details_jefferylovely__SuperThetaMaven/blob/main/results_2024-02-02T04-35-28.673518.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6545059772983542,
"acc_stderr": 0.03205377283844669,
"acc_norm": 0.6538200997878416,
"acc_norm_stderr": 0.03272641534569135,
"mc1": 0.5703794369645043,
"mc1_stderr": 0.01732923458040909,
"mc2": 0.7177387118634652,
"mc2_stderr": 0.014774281827372924
},
"harness|arc:challenge|25": {
"acc": 0.7107508532423208,
"acc_stderr": 0.013250012579393441,
"acc_norm": 0.7363481228668942,
"acc_norm_stderr": 0.012875929151297042
},
"harness|hellaswag|10": {
"acc": 0.7144991037641903,
"acc_stderr": 0.004507296196227809,
"acc_norm": 0.8899621589324835,
"acc_norm_stderr": 0.0031229736320394727
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406783,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886797,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886797
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.013547415658662264,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.013547415658662264
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42905027932960893,
"acc_stderr": 0.016553287863116037,
"acc_norm": 0.42905027932960893,
"acc_norm_stderr": 0.016553287863116037
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.023788583551658533,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.023788583551658533
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.012747248967079064,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.012747248967079064
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5703794369645043,
"mc1_stderr": 0.01732923458040909,
"mc2": 0.7177387118634652,
"mc2_stderr": 0.014774281827372924
},
"harness|winogrande|5": {
"acc": 0.8492501973164956,
"acc_stderr": 0.010056094631479674
},
"harness|gsm8k|5": {
"acc": 0.7012888551933283,
"acc_stderr": 0.012607137125693633
}
}
```
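As a quick illustration (not part of the generated card), a results dict shaped like the block above can be summarized in a few lines of Python; the file path below stands in for a hypothetical local copy of the linked results JSON:

```python
import json

# Hypothetical local copy of the results JSON shown above.
with open("results.json") as f:
    results = json.load(f)

# Collect the per-subtask MMLU ("hendrycksTest") accuracies and report an
# unweighted macro-average alongside the best and worst subtasks.
mmlu = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
}
ranked = sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)
print("best :", ranked[0])
print("worst:", ranked[-1])
print(f"macro-average over {len(mmlu)} subtasks: {sum(mmlu.values()) / len(mmlu):.4f}")
```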
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
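As a hedged sketch of the intended direct use (the repository id below is a placeholder, not this card's actual repo; the config and split names follow the pattern used throughout these cards), the per-task details can be loaded with `datasets`:

```python
from datasets import load_dataset

# Placeholder repository id: substitute the details repo for the model of
# interest. "harness_gsm8k_5" is one of the per-task configs, and the
# "latest" split always points at the most recent evaluation run.
details = load_dataset(
    "open-llm-leaderboard/details_<org>__<model>",
    "harness_gsm8k_5",
    split="latest",
)
print(details)
```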
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
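As an illustrative sketch of that structure (repository id again a placeholder): each details repository exposes one configuration per evaluated task plus an aggregated "results" configuration, and each configuration carries one split per run timestamp plus a "latest" alias:

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_<org>__<model>"  # placeholder id

# One config per task, e.g. harness_arc_challenge_25, harness_gsm8k_5, ...
for config in get_dataset_config_names(repo):
    # Each config has timestamped splits plus a "latest" alias.
    print(config, get_dataset_split_names(repo, config))
```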
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_sr5434__CodegebraGPT-10b | ---
pretty_name: Evaluation run of sr5434/CodegebraGPT-10b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [sr5434/CodegebraGPT-10b](https://huggingface.co/sr5434/CodegebraGPT-10b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sr5434__CodegebraGPT-10b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-05T01:29:14.413360](https://huggingface.co/datasets/open-llm-leaderboard/details_sr5434__CodegebraGPT-10b/blob/main/results_2024-01-05T01-29-14.413360.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6027565325073367,\n\
\ \"acc_stderr\": 0.03330413486893907,\n \"acc_norm\": 0.605828746614574,\n\
\ \"acc_norm_stderr\": 0.03399227272363922,\n \"mc1\": 0.3219094247246022,\n\
\ \"mc1_stderr\": 0.016355567611960404,\n \"mc2\": 0.46569772860679925,\n\
\ \"mc2_stderr\": 0.014510196356063874\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5571672354948806,\n \"acc_stderr\": 0.014515573873348911,\n\
\ \"acc_norm\": 0.5981228668941979,\n \"acc_norm_stderr\": 0.014327268614578274\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6385182234614618,\n\
\ \"acc_stderr\": 0.004794478426382608,\n \"acc_norm\": 0.834196375224059,\n\
\ \"acc_norm_stderr\": 0.0037114419828661815\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6452830188679245,\n \"acc_stderr\": 0.02944517532819959,\n\
\ \"acc_norm\": 0.6452830188679245,\n \"acc_norm_stderr\": 0.02944517532819959\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396264,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396264\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159795,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159795\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7387096774193549,\n\
\ \"acc_stderr\": 0.024993053397764826,\n \"acc_norm\": 0.7387096774193549,\n\
\ \"acc_norm_stderr\": 0.024993053397764826\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.033864057460620905,\n\
\ \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.033864057460620905\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198906,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198906\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593556,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593556\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5743589743589743,\n \"acc_stderr\": 0.025069094387296535,\n\
\ \"acc_norm\": 0.5743589743589743,\n \"acc_norm_stderr\": 0.025069094387296535\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251976,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251976\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5798319327731093,\n \"acc_stderr\": 0.03206183783236152,\n \
\ \"acc_norm\": 0.5798319327731093,\n \"acc_norm_stderr\": 0.03206183783236152\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8,\n \"acc_stderr\": 0.017149858514250955,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.017149858514250955\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n\
\ \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.029102254389674082,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.029102254389674082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.04039314978724561,\n\
\ \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.04039314978724561\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n\
\ \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.6574074074074074,\n\
\ \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459754,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459754\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7841634738186463,\n\
\ \"acc_stderr\": 0.014711684386139946,\n \"acc_norm\": 0.7841634738186463,\n\
\ \"acc_norm_stderr\": 0.014711684386139946\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.661849710982659,\n \"acc_stderr\": 0.025469770149400175,\n\
\ \"acc_norm\": 0.661849710982659,\n \"acc_norm_stderr\": 0.025469770149400175\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30614525139664805,\n\
\ \"acc_stderr\": 0.015414494487903219,\n \"acc_norm\": 0.30614525139664805,\n\
\ \"acc_norm_stderr\": 0.015414494487903219\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046626,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046626\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\
\ \"acc_stderr\": 0.027316847674192714,\n \"acc_norm\": 0.6366559485530546,\n\
\ \"acc_norm_stderr\": 0.027316847674192714\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824088,\n\
\ \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824088\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255856,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255856\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4511082138200782,\n\
\ \"acc_stderr\": 0.012709037347346233,\n \"acc_norm\": 0.4511082138200782,\n\
\ \"acc_norm_stderr\": 0.012709037347346233\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6066176470588235,\n \"acc_stderr\": 0.029674288281311155,\n\
\ \"acc_norm\": 0.6066176470588235,\n \"acc_norm_stderr\": 0.029674288281311155\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5947712418300654,\n \"acc_stderr\": 0.019861155193829163,\n \
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.019861155193829163\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387345,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387345\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3219094247246022,\n\
\ \"mc1_stderr\": 0.016355567611960404,\n \"mc2\": 0.46569772860679925,\n\
\ \"mc2_stderr\": 0.014510196356063874\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8097868981846882,\n \"acc_stderr\": 0.011030335798617442\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.45109931766489764,\n \
\ \"acc_stderr\": 0.013706458809664817\n }\n}\n```"
repo_url: https://huggingface.co/sr5434/CodegebraGPT-10b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|arc:challenge|25_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|arc:challenge|25_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|gsm8k|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|gsm8k|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hellaswag|10_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hellaswag|10_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T15-18-52.631261.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T01-29-14.413360.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T01-29-14.413360.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- '**/details_harness|winogrande|5_2023-12-30T15-18-52.631261.parquet'
- split: 2024_01_05T01_29_14.413360
path:
- '**/details_harness|winogrande|5_2024-01-05T01-29-14.413360.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-05T01-29-14.413360.parquet'
- config_name: results
data_files:
- split: 2023_12_30T15_18_52.631261
path:
- results_2023-12-30T15-18-52.631261.parquet
- split: 2024_01_05T01_29_14.413360
path:
- results_2024-01-05T01-29-14.413360.parquet
- split: latest
path:
- results_2024-01-05T01-29-14.413360.parquet
---
# Dataset Card for Evaluation run of sr5434/CodegebraGPT-10b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [sr5434/CodegebraGPT-10b](https://huggingface.co/sr5434/CodegebraGPT-10b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sr5434__CodegebraGPT-10b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-05T01:29:14.413360](https://huggingface.co/datasets/open-llm-leaderboard/details_sr5434__CodegebraGPT-10b/blob/main/results_2024-01-05T01-29-14.413360.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6027565325073367,
"acc_stderr": 0.03330413486893907,
"acc_norm": 0.605828746614574,
"acc_norm_stderr": 0.03399227272363922,
"mc1": 0.3219094247246022,
"mc1_stderr": 0.016355567611960404,
"mc2": 0.46569772860679925,
"mc2_stderr": 0.014510196356063874
},
"harness|arc:challenge|25": {
"acc": 0.5571672354948806,
"acc_stderr": 0.014515573873348911,
"acc_norm": 0.5981228668941979,
"acc_norm_stderr": 0.014327268614578274
},
"harness|hellaswag|10": {
"acc": 0.6385182234614618,
"acc_stderr": 0.004794478426382608,
"acc_norm": 0.834196375224059,
"acc_norm_stderr": 0.0037114419828661815
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6452830188679245,
"acc_stderr": 0.02944517532819959,
"acc_norm": 0.6452830188679245,
"acc_norm_stderr": 0.02944517532819959
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396264,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396264
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159795,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159795
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7387096774193549,
"acc_stderr": 0.024993053397764826,
"acc_norm": 0.7387096774193549,
"acc_norm_stderr": 0.024993053397764826
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.033864057460620905,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.033864057460620905
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198906,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198906
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593556,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593556
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5743589743589743,
"acc_stderr": 0.025069094387296535,
"acc_norm": 0.5743589743589743,
"acc_norm_stderr": 0.025069094387296535
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251976,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251976
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5798319327731093,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.5798319327731093,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8,
"acc_stderr": 0.017149858514250955,
"acc_norm": 0.8,
"acc_norm_stderr": 0.017149858514250955
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.029102254389674082,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.029102254389674082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.04039314978724561,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.04039314978724561
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.045879047413018105,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.045879047413018105
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459754,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459754
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7841634738186463,
"acc_stderr": 0.014711684386139946,
"acc_norm": 0.7841634738186463,
"acc_norm_stderr": 0.014711684386139946
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.661849710982659,
"acc_stderr": 0.025469770149400175,
"acc_norm": 0.661849710982659,
"acc_norm_stderr": 0.025469770149400175
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30614525139664805,
"acc_stderr": 0.015414494487903219,
"acc_norm": 0.30614525139664805,
"acc_norm_stderr": 0.015414494487903219
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046626,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046626
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6366559485530546,
"acc_stderr": 0.027316847674192714,
"acc_norm": 0.6366559485530546,
"acc_norm_stderr": 0.027316847674192714
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.02591006352824088,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.02591006352824088
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.02952591430255856,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.02952591430255856
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4511082138200782,
"acc_stderr": 0.012709037347346233,
"acc_norm": 0.4511082138200782,
"acc_norm_stderr": 0.012709037347346233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6066176470588235,
"acc_stderr": 0.029674288281311155,
"acc_norm": 0.6066176470588235,
"acc_norm_stderr": 0.029674288281311155
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.019861155193829163,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.019861155193829163
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065677,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387345,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387345
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3219094247246022,
"mc1_stderr": 0.016355567611960404,
"mc2": 0.46569772860679925,
"mc2_stderr": 0.014510196356063874
},
"harness|winogrande|5": {
"acc": 0.8097868981846882,
"acc_stderr": 0.011030335798617442
},
"harness|gsm8k|5": {
"acc": 0.45109931766489764,
"acc_stderr": 0.013706458809664817
}
}
```
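The aggregated metrics above can also be pulled directly from the `results` configuration; a minimal sketch, assuming the `latest` split alias declared in the configuration list:
```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split alias
# points to the most recent run (2024-01-05 at the time of writing).
results = load_dataset(
    "open-llm-leaderboard/details_sr5434__CodegebraGPT-10b",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated metrics of the run
```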
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_stsb_it_is_referential | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 727
num_examples: 4
- name: test
num_bytes: 247
num_examples: 2
- name: train
num_bytes: 1164
num_examples: 6
download_size: 9992
dataset_size: 2138
---
# Dataset Card for "MULTI_VALUE_stsb_it_is_referential"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distinsion/generated_images | ---
size_categories: n<1K
config_names:
- text_field_for_argilla
tags:
- synthetic
- distilabel
- rlaif
---
<p align="left">
<a href="https://github.com/argilla-io/distilabel">
<img src="https://raw.githubusercontent.com/argilla-io/distilabel/main/docs/assets/distilabel-badge-light.png" alt="Built with Distilabel" width="200" height="32"/>
</a>
</p>
# Dataset Card for generated_images
This dataset has been created with [Distilabel](https://distilabel.argilla.io/).
## Dataset Summary
This dataset contains a `pipeline.yaml` which can be used to reproduce the pipeline that generated it in distilabel using the `distilabel` CLI:
```console
distilabel pipeline run --config "https://huggingface.co/datasets/distinsion/generated_images/raw/main/pipeline.yaml"
```
or explore the configuration:
```console
distilabel pipeline info --config "https://huggingface.co/datasets/distinsion/generated_images/raw/main/pipeline.yaml"
```
## Dataset structure
The examples have the following structure per configuration:
<details><summary> Configuration: text_field_for_argilla </summary><hr>
```json
{
"evolved_instruction": "Can you identify the species of this bird?",
"generated_image_url": "https://oaidalleapiprodscus.blob.core.windows.net/private/org-2FK3RSLIytfK9EtweFtnnAfg/user-hfvBma1cXbdtoCf3WES95wiB/img-7xYV261geAAr9mjtMy829srS.png?st=2024-04-12T09%3A20%3A52Z\u0026se=2024-04-12T11%3A20%3A52Z\u0026sp=r\u0026sv=2021-08-06\u0026sr=b\u0026rscd=inline\u0026rsct=image/png\u0026skoid=6aaadede-4fb3-4698-a8f6-684d7786b067\u0026sktid=a48cca56-e6da-484e-a814-9c849652bcb3\u0026skt=2024-04-12T08%3A04%3A02Z\u0026ske=2024-04-13T08%3A04%3A02Z\u0026sks=b\u0026skv=2021-08-06\u0026sig=4zCigddawumNANK4bsEREFv2wUA%2BeKZwzQ/4YvmlaOQ%3D",
"generated_instruction": "What species is depicted in the image, and can you provide some details about its natural habitat and adaptations?",
"generated_response": "The image shows a llama, which is native to the Andes Mountains in South America. Llamas are well adapted to the high altitude and cold temperatures of the mountains, with their thick",
"generation": "The image you provided is actually of a Highland cow, not a bird. Highland cows are a Scottish breed of cattle known for their long horns and shaggy coats. They are well adapted to the cold weather of the Scottish Highlands.",
"image_gen_prompt": "Create an image of a llama standing in a mountain grassland setting, with the Andes Mountains visible in the background. The llama should appear fluffy and predominantly white with some light brown patches. It should be shown early in the morning with light mist swirling around, suggesting a chilly climate.",
"instruction_with_image": [
{
"content": [
{
"image_url": null,
"text": "Can you identify the species of this bird?",
"type": "text"
},
{
"image_url": {
"url": "https://picsum.photos/id/200/1920/1280"
},
"text": null,
"type": "image_url"
}
],
"role": "user"
}
],
"model_name": "gpt-4-turbo-2024-04-09",
"text_field_for_argilla": "**Instruction**: What species is depicted in the image, and can you provide some details about its natural habitat and adaptations?\n\n\n\n**Response**: The image shows a llama, which is native to the Andes Mountains in South America. Llamas are well adapted to the high altitude and cold temperatures of the mountains, with their thick\n"
}
```
This subset can be loaded as:
```python
from datasets import load_dataset
ds = load_dataset("distinsion/generated_images", "text_field_for_argilla")
```
</details>
|
Ondiet/bert_model | ---
license: openrail
---
|
Jephson/edited-sky-dataset-4 | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: edited_image
dtype: image
splits:
- name: train
num_bytes: 325824748.72
num_examples: 1120
download_size: 325957005
dataset_size: 325824748.72
---
# Dataset Card for "edited-sky-dataset-4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
delphi-suite/v0-next-logprobs-llama2-200k | ---
dataset_info:
features:
- name: logprobs
sequence: float64
splits:
- name: validation
num_bytes: 45818277
num_examples: 10982
download_size: 37571491
dataset_size: 45818277
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/bcd1c9db | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 184
num_examples: 10
download_size: 1341
dataset_size: 184
---
# Dataset Card for "bcd1c9db"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/clinicaltrials_2019 | ---
pretty_name: '`clinicaltrials/2019`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `clinicaltrials/2019`
The `clinicaltrials/2019` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/clinicaltrials#clinicaltrials/2019).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=306,238
This dataset is used by: [`clinicaltrials_2019_trec-pm-2019`](https://huggingface.co/datasets/irds/clinicaltrials_2019_trec-pm-2019)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/clinicaltrials_2019', 'docs')
for record in docs:
record # {'doc_id': ..., 'title': ..., 'condition': ..., 'summary': ..., 'detailed_description': ..., 'eligibility': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
|
CyberHarem/todoroki_nene_seitokaiyakuindomo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Todoroki Nene (Seitokai Yakuindomo)
This is the dataset of Todoroki Nene (Seitokai Yakuindomo), containing 88 images and their tags.
The core tags of this character are `brown_hair, glasses, brown_eyes, bow, short_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 88 | 47.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/todoroki_nene_seitokaiyakuindomo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 88 | 39.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/todoroki_nene_seitokaiyakuindomo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 190 | 80.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/todoroki_nene_seitokaiyakuindomo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 88 | 47.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/todoroki_nene_seitokaiyakuindomo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 190 | 93.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/todoroki_nene_seitokaiyakuindomo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
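The plain IMG+TXT packages can be fetched the same way; a minimal sketch for the 800px package, assuming the `dataset-800.zip` filename from the download links above:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# Download and extract the 800px IMG+TXT package listed in the table above.
zip_file = hf_hub_download(
    repo_id='CyberHarem/todoroki_nene_seitokaiyakuindomo',
    repo_type='dataset',
    filename='dataset-800.zip',
)
output_dir = 'dataset_800'
os.makedirs(output_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(output_dir)
```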
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/todoroki_nene_seitokaiyakuindomo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, school_uniform, solo, blazer, smile |
| 1 | 5 |  |  |  |  |  | 1girl, closed_eyes, school_uniform, smile, solo |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | school_uniform | solo | blazer | smile | closed_eyes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-------|:---------|:--------|:--------------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | |
| 1 | 5 |  |  |  |  |  | X | X | X | | X | X |
|
communityai/aptchat-v2-general-12k | ---
dataset_info:
features:
- name: category
dtype: string
- name: total_tokens
dtype: int64
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 163318716.0
num_examples: 12197
download_size: 86039185
dataset_size: 163318716.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Hiraishin/ujianjpj-tanda-isyarat | ---
license: apache-2.0
---
|
bazyl/GTSRB | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language: []
license:
- gpl-3.0
multilinguality: []
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- image-classification
task_ids:
- multi-label-image-classification
pretty_name: GTSRB
---
# Dataset Card for GTSRB
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** http://www.sciencedirect.com/science/article/pii/S0893608012000457
- **Repository:** https://github.com/bazylhorsey/gtsrb/
- **Paper:** Man vs. computer: Benchmarking machine learning algorithms for traffic sign recognition
- **Leaderboard:** https://benchmark.ini.rub.de/gtsrb_results.html
- **Point of Contact:** bhorsey16@gmail.com
### Dataset Summary
The German Traffic Sign Benchmark is a multi-class, single-image classification challenge held at the International Joint Conference on Neural Networks (IJCNN) 2011. We cordially invite researchers from relevant fields to participate: The competition is designed to allow for participation without special domain knowledge. Our benchmark has the following properties:
- Single-image, multi-class classification problem
- More than 40 classes
- More than 50,000 images in total
- Large, lifelike database
### Supported Tasks and Leaderboards
[Kaggle](https://www.kaggle.com/datasets/meowmeowmeowmeowmeow/gtsrb-german-traffic-sign) \
[Original](https://benchmark.ini.rub.de/gtsrb_results.html)
## Dataset Structure
### Data Instances
```
{
"Width": 31,
"Height": 31,
"Roi.X1": 6,
"Roi.Y1": 6,
"Roi.X2": 26,
"Roi.Y2": 26,
"ClassId": 20,
"Path": "Train/20/00020_00004_00002.png",
}
```
### Data Fields
- Width: Width of the image
- Height: Height of the image
- Roi.X1: Upper-left X coordinate
- Roi.Y1: Upper-left Y coordinate
- Roi.X2: Lower-right X coordinate
- Roi.Y2: Lower-right Y coordinate
- ClassId: Class of the image
- Path: Path of the image
### Data Splits
- Categories: 42
- Train: 39,209
- Test: 12,630
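A minimal loading sketch, assuming the `bazyl/GTSRB` Hub repository resolves to the `train` and `test` splits listed above:
```python
from datasets import load_dataset

# Assumes the repo's loading script exposes the train/test splits above.
gtsrb = load_dataset("bazyl/GTSRB")
example = gtsrb["train"][0]
print(example["ClassId"], example["Path"])  # class label and image path
```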
## Dataset Creation
### Curation Rationale
Recognition of traffic signs is a challenging real-world problem of high industrial relevance. Although commercial systems have reached the market and several studies on this topic have been published, systematic unbiased comparisons of different approaches are missing and comprehensive benchmark datasets are not freely available.
Traffic sign recognition is a multi-class classification problem with unbalanced class frequencies. Traffic signs show a wide range of variations between classes in terms of color, shape, and the presence of pictograms or text. However, there exist subsets of classes (e.g., speed limit signs) that are very similar to each other.
The classifier has to cope with large variations in visual appearances due to illumination changes, partial occlusions, rotations, weather conditions, etc.
Humans are capable of recognizing the large variety of existing road signs with close to 100% correctness. This applies not only to real-world driving, which provides both context and multiple views of a single traffic sign, but also to recognition from single images.
<!-- ### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
[Needs More Information] -->
|
iahlt/alarab_articles | ---
dataset_info:
features:
- name: url
dtype: string
- name: title
dtype: string
- name: description
dtype: string
- name: meta_keywords
sequence: string
- name: tags
sequence: string
- name: public_date
dtype: string
- name: author
dtype: string
- name: subtitle
dtype: string
- name: view
dtype: int64
- name: main_category
dtype: string
- name: sub_category
dtype: string
- name: city
dtype: string
- name: text
dtype: string
- name: title_len
dtype: int64
- name: description_len
dtype: int64
- name: public_date_len
dtype: float64
- name: subtitle_len
dtype: float64
- name: text_len
dtype: int64
splits:
- name: train
num_bytes: 497537790
num_examples: 145069
download_size: 211536806
dataset_size: 497537790
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- feature-extraction
- fill-mask
language:
- ar
---
### Description
Articles scraped from Alarab (~145,069 in total).
### Usage
```python
from datasets import load_dataset
ds = load_dataset("iahlt/alarab_articles")
```
### Sample
```python
{'url': 'https://www.alarab.co.il/Article/1038538',
'title': 'وفيات النقب | الحاجة فاطمة البحيري في ذمة الله',
'description': 'انتقلت إلى رحمته تعالى مساء اليوم، السبت، في اللقية الحاجة فاطمة حسن الأسد-البحيري (أم سلمان) عن عمر ناهز 93 عاما. والمرحومة هي أرملة المرحوم علي البحيري.ومن المتوقع أن يتم تشييع جثمانها الطاهر إلى أول منازل الآخرة في مقبرة خربة اللقية صب',
'meta_keywords': ['اخبار اليوم، موقع العرب، اخبار العرب، موقع أخبار ، رياضة ، سياسة ، فن عالمي ، فن عربي ، اقتصاد ، موسيقى ، ترفيه ، ألعاب ، سيارات ، أغاني ، كليبات ، افلام عربية ، صور جميلات العرب ومشاهير العرب'],
'tags': ['حالة الطقس',
'اسعار العملات مقابل الشيكل',
'الطقس',
'حالة الطقس اليوم'],
'public_date': '09/07/22 22:40',
'author': 'ياسر العقبي',
'subtitle': 'وفيات النقب | الحاجة فاطمة البحيري في ذمة الله',
'view': 20,
'main_category': 'أخبار',
'sub_category': 'وفيات',
'city': None,
'text': 'انتقلت إلى رحمته تعالى مساء اليوم، السبت، في اللقية الحاجة فاطمة حسن الأسد-البحيري (أم سلمان) عن عمر ناهز 93 عاما. والمرحومة هي أرملة المرحوم علي البحيري.ومن المتوقع أن يتم تشييع جثمانها الطاهر إلى أول منازل الآخرة في مقبرة خربة اللقية صباح يوم غد الأحد.تقبل التعازي في خيمة العزاء بالقرب من بيت الفقيدة.',
'title_len': 46,
'description_len': 238,
'public_date_len': 14.0,
'subtitle_len': 46.0,
'text_len': 304}
```
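The precomputed length fields make quick filtering straightforward; a small sketch, assuming the schema above:
```python
from datasets import load_dataset

ds = load_dataset("iahlt/alarab_articles", split="train")

# Keep only articles with a reasonably long body, using the text_len field.
long_articles = ds.filter(lambda row: row["text_len"] > 1000)
print(len(long_articles))
```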
### Citation
If you use this dataset, please cite:
```
@InProceedings{iahlt2023alarab_articles,
author = "iahlt",
title = "Arabic Corpus: Alarab",
year = "2023",
publisher = "",
location = "",
}
``` |
open-llm-leaderboard/details_genaicore3434__Mistral-7b-instruct-v0.2-summ-sft-lp-e1 | ---
pretty_name: Evaluation run of genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-lp-e1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-lp-e1](https://huggingface.co/genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-lp-e1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_genaicore3434__Mistral-7b-instruct-v0.2-summ-sft-lp-e1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-21T06:47:29.951488](https://huggingface.co/datasets/open-llm-leaderboard/details_genaicore3434__Mistral-7b-instruct-v0.2-summ-sft-lp-e1/blob/main/results_2024-01-21T06-47-29.951488.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.59052978065543,\n\
\ \"acc_stderr\": 0.03349268505074206,\n \"acc_norm\": 0.5952047695238794,\n\
\ \"acc_norm_stderr\": 0.03418111471832376,\n \"mc1\": 0.4663402692778458,\n\
\ \"mc1_stderr\": 0.017463793867168106,\n \"mc2\": 0.6325766616332602,\n\
\ \"mc2_stderr\": 0.015487593519142183\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5477815699658704,\n \"acc_stderr\": 0.014544519880633825,\n\
\ \"acc_norm\": 0.5955631399317406,\n \"acc_norm_stderr\": 0.01434203648343618\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6301533559051982,\n\
\ \"acc_stderr\": 0.004817763581410245,\n \"acc_norm\": 0.8227444732125074,\n\
\ \"acc_norm_stderr\": 0.0038110434120246627\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6377358490566037,\n \"acc_stderr\": 0.0295822451283843,\n\
\ \"acc_norm\": 0.6377358490566037,\n \"acc_norm_stderr\": 0.0295822451283843\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n\
\ \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n\
\ \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.03267151848924777,\n\
\ \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.03267151848924777\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159788,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159788\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6612903225806451,\n\
\ \"acc_stderr\": 0.026923446059302837,\n \"acc_norm\": 0.6612903225806451,\n\
\ \"acc_norm_stderr\": 0.026923446059302837\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.0351760354036101,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.0351760354036101\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.035679697722680495,\n\
\ \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.035679697722680495\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.03115626951964683,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.03115626951964683\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\
\ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5384615384615384,\n \"acc_stderr\": 0.025275892070240644,\n\
\ \"acc_norm\": 0.5384615384615384,\n \"acc_norm_stderr\": 0.025275892070240644\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857406,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857406\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5546218487394958,\n \"acc_stderr\": 0.0322841062671639,\n \
\ \"acc_norm\": 0.5546218487394958,\n \"acc_norm_stderr\": 0.0322841062671639\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7834862385321101,\n \"acc_stderr\": 0.017658710594443128,\n \"\
acc_norm\": 0.7834862385321101,\n \"acc_norm_stderr\": 0.017658710594443128\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501947,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501947\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n\
\ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n\
\ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094634,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094634\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.0465614711001235,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.0465614711001235\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7624521072796935,\n\
\ \"acc_stderr\": 0.015218733046150191,\n \"acc_norm\": 0.7624521072796935,\n\
\ \"acc_norm_stderr\": 0.015218733046150191\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6560693641618497,\n \"acc_stderr\": 0.025574123786546665,\n\
\ \"acc_norm\": 0.6560693641618497,\n \"acc_norm_stderr\": 0.025574123786546665\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33519553072625696,\n\
\ \"acc_stderr\": 0.015788007190185888,\n \"acc_norm\": 0.33519553072625696,\n\
\ \"acc_norm_stderr\": 0.015788007190185888\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.027121956071388852,\n\
\ \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.027121956071388852\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n\
\ \"acc_stderr\": 0.02666441088693762,\n \"acc_norm\": 0.6720257234726688,\n\
\ \"acc_norm_stderr\": 0.02666441088693762\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.025910063528240875,\n\
\ \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.025910063528240875\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.424380704041721,\n\
\ \"acc_stderr\": 0.01262334375743002,\n \"acc_norm\": 0.424380704041721,\n\
\ \"acc_norm_stderr\": 0.01262334375743002\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.029768263528933105,\n\
\ \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.029768263528933105\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6143790849673203,\n \"acc_stderr\": 0.019691459052354022,\n \
\ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.019691459052354022\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n\
\ \"acc_stderr\": 0.029705284056772432,\n \"acc_norm\": 0.7711442786069652,\n\
\ \"acc_norm_stderr\": 0.029705284056772432\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4663402692778458,\n\
\ \"mc1_stderr\": 0.017463793867168106,\n \"mc2\": 0.6325766616332602,\n\
\ \"mc2_stderr\": 0.015487593519142183\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838229\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3752843062926459,\n \
\ \"acc_stderr\": 0.013337170545742934\n }\n}\n```"
repo_url: https://huggingface.co/genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-lp-e1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|arc:challenge|25_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|arc:challenge|25_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|gsm8k|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|gsm8k|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hellaswag|10_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hellaswag|10_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T06-41-01.110110.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T06-47-29.951488.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T06-47-29.951488.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- '**/details_harness|winogrande|5_2024-01-21T06-41-01.110110.parquet'
- split: 2024_01_21T06_47_29.951488
path:
- '**/details_harness|winogrande|5_2024-01-21T06-47-29.951488.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-21T06-47-29.951488.parquet'
- config_name: results
data_files:
- split: 2024_01_21T06_41_01.110110
path:
- results_2024-01-21T06-41-01.110110.parquet
- split: 2024_01_21T06_47_29.951488
path:
- results_2024-01-21T06-47-29.951488.parquet
- split: latest
path:
- results_2024-01-21T06-47-29.951488.parquet
---
# Dataset Card for Evaluation run of genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-lp-e1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-lp-e1](https://huggingface.co/genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-lp-e1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_genaicore3434__Mistral-7b-instruct-v0.2-summ-sft-lp-e1",
"harness_winogrande_5",
split="train")
```
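Other configurations load the same way. As a minimal sketch (the configuration and split names below are taken from the YAML listing above, and it assumes a recent version of the `datasets` library), you can target a single subtask or pin an exact run by its timestamped split instead of "latest":
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_genaicore3434__Mistral-7b-instruct-v0.2-summ-sft-lp-e1"

# Details for one MMLU subtask; the "latest" split always points to the
# most recent evaluation run of that configuration.
world_religions = load_dataset(REPO, "harness_hendrycksTest_world_religions_5", split="latest")

# Pin an exact run by using its timestamped split name instead of "latest".
gsm8k_run = load_dataset(REPO, "harness_gsm8k_5", split="2024_01_21T06_47_29.951488")
```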
## Latest results
These are the [latest results from run 2024-01-21T06:47:29.951488](https://huggingface.co/datasets/open-llm-leaderboard/details_genaicore3434__Mistral-7b-instruct-v0.2-summ-sft-lp-e1/blob/main/results_2024-01-21T06-47-29.951488.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.59052978065543,
"acc_stderr": 0.03349268505074206,
"acc_norm": 0.5952047695238794,
"acc_norm_stderr": 0.03418111471832376,
"mc1": 0.4663402692778458,
"mc1_stderr": 0.017463793867168106,
"mc2": 0.6325766616332602,
"mc2_stderr": 0.015487593519142183
},
"harness|arc:challenge|25": {
"acc": 0.5477815699658704,
"acc_stderr": 0.014544519880633825,
"acc_norm": 0.5955631399317406,
"acc_norm_stderr": 0.01434203648343618
},
"harness|hellaswag|10": {
"acc": 0.6301533559051982,
"acc_stderr": 0.004817763581410245,
"acc_norm": 0.8227444732125074,
"acc_norm_stderr": 0.0038110434120246627
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6377358490566037,
"acc_stderr": 0.0295822451283843,
"acc_norm": 0.6377358490566037,
"acc_norm_stderr": 0.0295822451283843
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.03267151848924777,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.03267151848924777
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159788,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6612903225806451,
"acc_stderr": 0.026923446059302837,
"acc_norm": 0.6612903225806451,
"acc_norm_stderr": 0.026923446059302837
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.0351760354036101,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.0351760354036101
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.035679697722680495,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.035679697722680495
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.03115626951964683,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.03115626951964683
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5384615384615384,
"acc_stderr": 0.025275892070240644,
"acc_norm": 0.5384615384615384,
"acc_norm_stderr": 0.025275892070240644
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857406,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857406
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5546218487394958,
"acc_stderr": 0.0322841062671639,
"acc_norm": 0.5546218487394958,
"acc_norm_stderr": 0.0322841062671639
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7834862385321101,
"acc_stderr": 0.017658710594443128,
"acc_norm": 0.7834862385321101,
"acc_norm_stderr": 0.017658710594443128
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501947,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501947
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094634,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094634
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.0465614711001235,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.0465614711001235
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7624521072796935,
"acc_stderr": 0.015218733046150191,
"acc_norm": 0.7624521072796935,
"acc_norm_stderr": 0.015218733046150191
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6560693641618497,
"acc_stderr": 0.025574123786546665,
"acc_norm": 0.6560693641618497,
"acc_norm_stderr": 0.025574123786546665
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33519553072625696,
"acc_stderr": 0.015788007190185888,
"acc_norm": 0.33519553072625696,
"acc_norm_stderr": 0.015788007190185888
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.027121956071388852,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.027121956071388852
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.02666441088693762,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.02666441088693762
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.025910063528240875,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.025910063528240875
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.424380704041721,
"acc_stderr": 0.01262334375743002,
"acc_norm": 0.424380704041721,
"acc_norm_stderr": 0.01262334375743002
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5992647058823529,
"acc_stderr": 0.029768263528933105,
"acc_norm": 0.5992647058823529,
"acc_norm_stderr": 0.029768263528933105
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.019691459052354022,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.019691459052354022
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7711442786069652,
"acc_stderr": 0.029705284056772432,
"acc_norm": 0.7711442786069652,
"acc_norm_stderr": 0.029705284056772432
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4663402692778458,
"mc1_stderr": 0.017463793867168106,
"mc2": 0.6325766616332602,
"mc2_stderr": 0.015487593519142183
},
"harness|winogrande|5": {
"acc": 0.7703235990528808,
"acc_stderr": 0.011821645601838229
},
"harness|gsm8k|5": {
"acc": 0.3752843062926459,
"acc_stderr": 0.013337170545742934
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/quincy_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of quincy/クインシー/昆西 (Azur Lane)
This is the dataset of quincy/クインシー/昆西 (Azur Lane), containing 28 images and their tags.
The core tags of this character are `long_hair, pink_hair, breasts, large_breasts, brown_eyes, braid, ahoge, bangs, hair_between_eyes, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g., danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 28 | 25.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/quincy_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 28 | 18.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/quincy_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 59 | 34.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/quincy_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 28 | 23.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/quincy_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 59 | 42.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/quincy_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
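The IMG+TXT packages in the table are plain zip archives, so they can be fetched without waifuc. A minimal sketch for the `800` package, assuming only `huggingface_hub` is installed (the target directory name `quincy_800` is arbitrary):
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download one of the IMG+TXT archives listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/quincy_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)
# extract the images and their tag text files
output_dir = 'quincy_800'
os.makedirs(output_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(output_dir)
```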
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/quincy_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some character outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, solo, red_necktie, simple_background, skirt, blush, open_mouth, white_background, white_thighhighs, airplane, between_breasts, full_body, machinery, rigging, zettai_ryouiki |
| 1 | 6 |  |  |  |  |  | 1girl, blush, solo, sun_hat, cleavage, collarbone, looking_at_viewer, navel, see-through, side-tie_bikini_bottom, hat_ribbon, purple_bikini, smile, white_headwear, bare_shoulders, closed_mouth, flower, hat_bow, inflatable_toy, thighs, wariza |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | looking_at_viewer | solo | red_necktie | simple_background | skirt | blush | open_mouth | white_background | white_thighhighs | airplane | between_breasts | full_body | machinery | rigging | zettai_ryouiki | sun_hat | collarbone | navel | see-through | side-tie_bikini_bottom | hat_ribbon | purple_bikini | smile | white_headwear | bare_shoulders | closed_mouth | flower | hat_bow | inflatable_toy | thighs | wariza |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:--------------------|:-------|:--------------|:--------------------|:--------|:--------|:-------------|:-------------------|:-------------------|:-----------|:------------------|:------------|:------------|:----------|:-----------------|:----------|:-------------|:--------|:--------------|:-------------------------|:-------------|:----------------|:--------|:-----------------|:-----------------|:---------------|:---------|:----------|:-----------------|:---------|:---------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
HuggingFaceTB/wiki_formal_sciences_college_students_1k | ---
dataset_info:
features:
- name: top_category
dtype: string
- name: subcategory_1
dtype: string
- name: subcategory_2
dtype: string
- name: subcategory_3
dtype: string
- name: subcategory_4
dtype: string
- name: subcategory_5
dtype: string
- name: prompt
dtype: string
- name: completion
dtype: string
- name: token_length
dtype: int64
splits:
- name: train
num_bytes: 6422523
num_examples: 1000
download_size: 3183994
dataset_size: 6422523
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nateraw/background-remover-files | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_openchat__openchat_v3.2_super | ---
pretty_name: Evaluation run of openchat/openchat_v3.2_super
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openchat/openchat_v3.2_super](https://huggingface.co/openchat/openchat_v3.2_super)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openchat__openchat_v3.2_super\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T01:02:51.015590](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat_v3.2_super/blob/main/results_2023-10-18T01-02-51.015590.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n\
\ \"em_stderr\": 0.00039210421902982623,\n \"f1\": 0.058767827181208196,\n\
\ \"f1_stderr\": 0.0013192048135182055,\n \"acc\": 0.4471122977692914,\n\
\ \"acc_stderr\": 0.010713437247397681\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.00039210421902982623,\n\
\ \"f1\": 0.058767827181208196,\n \"f1_stderr\": 0.0013192048135182055\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13495072024260804,\n \
\ \"acc_stderr\": 0.009411315282571171\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7592738752959748,\n \"acc_stderr\": 0.012015559212224192\n\
\ }\n}\n```"
repo_url: https://huggingface.co/openchat/openchat_v3.2_super
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|arc:challenge|25_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T01_02_51.015590
path:
- '**/details_harness|drop|3_2023-10-18T01-02-51.015590.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T01-02-51.015590.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T01_02_51.015590
path:
- '**/details_harness|gsm8k|5_2023-10-18T01-02-51.015590.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T01-02-51.015590.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hellaswag|10_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T01_02_51.015590
path:
- '**/details_harness|winogrande|5_2023-10-18T01-02-51.015590.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T01-02-51.015590.parquet'
- config_name: results
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- results_2023-09-05T08:28:49.460161.parquet
- split: 2023_10_18T01_02_51.015590
path:
- results_2023-10-18T01-02-51.015590.parquet
- split: latest
path:
- results_2023-10-18T01-02-51.015590.parquet
---
# Dataset Card for Evaluation run of openchat/openchat_v3.2_super
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openchat/openchat_v3.2_super
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [openchat/openchat_v3.2_super](https://huggingface.co/openchat/openchat_v3.2_super) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openchat__openchat_v3.2_super",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-18T01:02:51.015590](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat_v3.2_super/blob/main/results_2023-10-18T01-02-51.015590.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.00039210421902982623,
"f1": 0.058767827181208196,
"f1_stderr": 0.0013192048135182055,
"acc": 0.4471122977692914,
"acc_stderr": 0.010713437247397681
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.00039210421902982623,
"f1": 0.058767827181208196,
"f1_stderr": 0.0013192048135182055
},
"harness|gsm8k|5": {
"acc": 0.13495072024260804,
"acc_stderr": 0.009411315282571171
},
"harness|winogrande|5": {
"acc": 0.7592738752959748,
"acc_stderr": 0.012015559212224192
}
}
```
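The aggregated figures above are also stored in the `results` configuration listed in this card's metadata. A minimal loading sketch, where the `latest` split resolves to the most recent timestamped run:
```python
from datasets import load_dataset

# aggregated metrics of the most recent run; individual runs are
# available under their timestamped split names
results = load_dataset(
    "open-llm-leaderboard/details_openchat__openchat_v3.2_super",
    "results",
    split="latest",
)
```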
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tner/mit_movie_trivia | ---
language:
- en
license:
- other
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
task_categories:
- token-classification
task_ids:
- named-entity-recognition
pretty_name: MIT Movie
---
# Dataset Card for "tner/mit_movie_trivia"
## Dataset Description
- **Repository:** [T-NER](https://github.com/asahi417/tner)
- **Dataset:** MIT Movie
- **Domain:** Movie
- **Number of Entity:** 12
### Dataset Summary
MIT Movie NER dataset formatted as part of the [TNER](https://github.com/asahi417/tner) project.
- Entity Types: `Actor`, `Plot`, `Opinion`, `Award`, `Year`, `Genre`, `Origin`, `Director`, `Soundtrack`, `Relationship`, `Character_Name`, `Quote`
## Dataset Structure
### Data Instances
An example of `train` looks as follows.
```python
{
'tags': [0, 13, 14, 0, 0, 0, 3, 4, 4, 4, 4, 4, 4, 4, 4],
'tokens': ['a', 'steven', 'spielberg', 'film', 'featuring', 'a', 'bluff', 'called', 'devil', 's', 'tower', 'and', 'a', 'spectacular', 'mothership']
}
```
### Label ID
The label2id dictionary can be found [here](https://huggingface.co/datasets/tner/mit_movie_trivia/raw/main/dataset/label.json).
```python
{
"O": 0,
"B-Actor": 1,
"I-Actor": 2,
"B-Plot": 3,
"I-Plot": 4,
"B-Opinion": 5,
"I-Opinion": 6,
"B-Award": 7,
"I-Award": 8,
"B-Year": 9,
"B-Genre": 10,
"B-Origin": 11,
"I-Origin": 12,
"B-Director": 13,
"I-Director": 14,
"I-Genre": 15,
"I-Year": 16,
"B-Soundtrack": 17,
"I-Soundtrack": 18,
"B-Relationship": 19,
"I-Relationship": 20,
"B-Character_Name": 21,
"I-Character_Name": 22,
"B-Quote": 23,
"I-Quote": 24
}
```
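As a minimal sketch, the mapping above (fetched from the `label.json` link) can be inverted to decode the integer `tags` of an instance back into label strings; this assumes `requests` and `datasets` are installed:
```python
import requests
from datasets import load_dataset

# fetch and invert the label2id mapping shown above
label2id = requests.get(
    "https://huggingface.co/datasets/tner/mit_movie_trivia/raw/main/dataset/label.json"
).json()
id2label = {v: k for k, v in label2id.items()}

# decode the tags of the first training instance into label strings
data = load_dataset("tner/mit_movie_trivia", split="train")
example = data[0]
print(list(zip(example["tokens"], [id2label[t] for t in example["tags"]])))
```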
### Data Splits
| name |train|validation|test|
|---------|----:|---------:|---:|
|mit_movie_trivia |6816 | 1000| 1953|
|
Scalable-ML/testSet | ---
dataset_info:
features:
- name: app_temp
dtype: float64
- name: azimuth
dtype: float64
- name: clouds
dtype: int64
- name: datetime
dtype: string
- name: dewpt
dtype: float64
- name: dhi
dtype: int64
- name: dni
dtype: int64
- name: elev_angle
dtype: float64
- name: ghi
dtype: int64
- name: h_angle
dtype: 'null'
- name: pod
dtype: string
- name: precip
dtype: float64
- name: pres
dtype: int64
- name: revision_status
dtype: string
- name: rh
dtype: int64
- name: slp
dtype: int64
- name: snow
dtype: float64
- name: solar_rad
dtype: int64
- name: temp
dtype: float64
- name: timestamp_local
dtype: string
- name: timestamp_utc
dtype: string
- name: ts
dtype: int64
- name: uv
dtype: float64
- name: vis
dtype: float64
- name: weather
dtype: string
- name: wind_dir
dtype: int64
- name: wind_gust_spd
dtype: float64
- name: wind_spd
dtype: float64
splits:
- name: train
num_bytes: 2372841
num_examples: 7669
download_size: 0
dataset_size: 2372841
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "testSet"
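A minimal loading sketch, assuming the `data/train-*` files referenced in the config are present; the field names come from the `dataset_info` block above:
```python
from datasets import load_dataset

# load the single train split (7669 weather records)
ds = load_dataset("Scalable-ML/testSet", split="train")

# inspect a few fields named in the dataset_info block
row = ds[0]
print(row["timestamp_utc"], row["temp"], row["ghi"], row["wind_spd"])
```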
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/med_alpaca_standardized_cluster_32_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 8441660
num_examples: 26651
download_size: 4608297
dataset_size: 8441660
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_32_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3-ensemble | ---
pretty_name: Evaluation run of yeontaek/Platypus2xOpenOrca-13B-IA3-ensemble
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/Platypus2xOpenOrca-13B-IA3-ensemble](https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-IA3-ensemble)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3-ensemble\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-23T14:48:47.168259](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3-ensemble/blob/main/results_2023-08-23T14%3A48%3A47.168259.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.591031165151078,\n\
\ \"acc_stderr\": 0.033910448517384374,\n \"acc_norm\": 0.5951164394626702,\n\
\ \"acc_norm_stderr\": 0.033889044058760844,\n \"mc1\": 0.3317013463892289,\n\
\ \"mc1_stderr\": 0.016482148810241477,\n \"mc2\": 0.47461350738527963,\n\
\ \"mc2_stderr\": 0.015202805791318129\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5844709897610921,\n \"acc_stderr\": 0.014401366641216386,\n\
\ \"acc_norm\": 0.621160409556314,\n \"acc_norm_stderr\": 0.014175915490000326\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6185022903804023,\n\
\ \"acc_stderr\": 0.004847615216473461,\n \"acc_norm\": 0.8228440549691296,\n\
\ \"acc_norm_stderr\": 0.0038102033089010925\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n\
\ \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n\
\ \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n\
\ \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.5664739884393064,\n\
\ \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.34656084656084657,\n \"acc_stderr\": 0.02450877752102842,\n \"\
acc_norm\": 0.34656084656084657,\n \"acc_norm_stderr\": 0.02450877752102842\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7064516129032258,\n \"acc_stderr\": 0.025906087021319295,\n \"\
acc_norm\": 0.7064516129032258,\n \"acc_norm_stderr\": 0.025906087021319295\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486519,\n \"\
acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486519\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5871794871794872,\n \"acc_stderr\": 0.024962683564331806,\n\
\ \"acc_norm\": 0.5871794871794872,\n \"acc_norm_stderr\": 0.024962683564331806\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340496,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7761467889908257,\n \"acc_stderr\": 0.017871217767790215,\n \"\
acc_norm\": 0.7761467889908257,\n \"acc_norm_stderr\": 0.017871217767790215\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.04039314978724561,\n\
\ \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.04039314978724561\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.02363687331748927,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.02363687331748927\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7547892720306514,\n\
\ \"acc_stderr\": 0.015384352284543941,\n \"acc_norm\": 0.7547892720306514,\n\
\ \"acc_norm_stderr\": 0.015384352284543941\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.025722802200895803,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.025722802200895803\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4491620111731844,\n\
\ \"acc_stderr\": 0.01663583834163192,\n \"acc_norm\": 0.4491620111731844,\n\
\ \"acc_norm_stderr\": 0.01663583834163192\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.027780141207023337,\n\
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.027780141207023337\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6430868167202572,\n\
\ \"acc_stderr\": 0.027210420375934023,\n \"acc_norm\": 0.6430868167202572,\n\
\ \"acc_norm_stderr\": 0.027210420375934023\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409828,\n\
\ \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409828\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303055,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303055\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4517601043024772,\n\
\ \"acc_stderr\": 0.012710662233660247,\n \"acc_norm\": 0.4517601043024772,\n\
\ \"acc_norm_stderr\": 0.012710662233660247\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5661764705882353,\n \"acc_stderr\": 0.03010563657001663,\n\
\ \"acc_norm\": 0.5661764705882353,\n \"acc_norm_stderr\": 0.03010563657001663\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6029411764705882,\n \"acc_stderr\": 0.019794488900024124,\n \
\ \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.019794488900024124\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.02971932942241748,\n\
\ \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.02971932942241748\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7661691542288557,\n\
\ \"acc_stderr\": 0.02992941540834839,\n \"acc_norm\": 0.7661691542288557,\n\
\ \"acc_norm_stderr\": 0.02992941540834839\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3317013463892289,\n\
\ \"mc1_stderr\": 0.016482148810241477,\n \"mc2\": 0.47461350738527963,\n\
\ \"mc2_stderr\": 0.015202805791318129\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-IA3-ensemble
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|arc:challenge|25_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hellaswag|10_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T14:48:47.168259.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T14:48:47.168259.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T14_48_47.168259
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T14:48:47.168259.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T14:48:47.168259.parquet'
---
# Dataset Card for Evaluation run of yeontaek/Platypus2xOpenOrca-13B-IA3-ensemble
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-IA3-ensemble
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/Platypus2xOpenOrca-13B-IA3-ensemble](https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-IA3-ensemble) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3-ensemble",
"harness_truthfulqa_mc_0",
split="train")
```
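A couple of variations on the same call, sketched under the assumption that the timestamped splits listed in this card's configs and the additional "results" configuration mentioned above behave like the per-task configurations:
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3-ensemble"

# Pin a per-task configuration to the single timestamped run listed in the
# configs, instead of the moving "latest" split.
anatomy = load_dataset(repo, "harness_hendrycksTest_anatomy_5",
                       split="2023_08_23T14_48_47.168259")

# The aggregated metrics live in the additional "results" configuration
# described above (assuming it exposes the same "latest" split naming).
results = load_dataset(repo, "results", split="latest")
```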
## Latest results
These are the [latest results from run 2023-08-23T14:48:47.168259](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3-ensemble/blob/main/results_2023-08-23T14%3A48%3A47.168259.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.591031165151078,
"acc_stderr": 0.033910448517384374,
"acc_norm": 0.5951164394626702,
"acc_norm_stderr": 0.033889044058760844,
"mc1": 0.3317013463892289,
"mc1_stderr": 0.016482148810241477,
"mc2": 0.47461350738527963,
"mc2_stderr": 0.015202805791318129
},
"harness|arc:challenge|25": {
"acc": 0.5844709897610921,
"acc_stderr": 0.014401366641216386,
"acc_norm": 0.621160409556314,
"acc_norm_stderr": 0.014175915490000326
},
"harness|hellaswag|10": {
"acc": 0.6185022903804023,
"acc_stderr": 0.004847615216473461,
"acc_norm": 0.8228440549691296,
"acc_norm_stderr": 0.0038102033089010925
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.03778621079092055,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.03778621079092055
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.34656084656084657,
"acc_stderr": 0.02450877752102842,
"acc_norm": 0.34656084656084657,
"acc_norm_stderr": 0.02450877752102842
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7064516129032258,
"acc_stderr": 0.025906087021319295,
"acc_norm": 0.7064516129032258,
"acc_norm_stderr": 0.025906087021319295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43842364532019706,
"acc_stderr": 0.03491207857486519,
"acc_norm": 0.43842364532019706,
"acc_norm_stderr": 0.03491207857486519
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5871794871794872,
"acc_stderr": 0.024962683564331806,
"acc_norm": 0.5871794871794872,
"acc_norm_stderr": 0.024962683564331806
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340496,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7761467889908257,
"acc_stderr": 0.017871217767790215,
"acc_norm": 0.7761467889908257,
"acc_norm_stderr": 0.017871217767790215
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.04039314978724561,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.04039314978724561
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.036028141763926456,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.036028141763926456
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.02363687331748927,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.02363687331748927
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7547892720306514,
"acc_stderr": 0.015384352284543941,
"acc_norm": 0.7547892720306514,
"acc_norm_stderr": 0.015384352284543941
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.025722802200895803,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.025722802200895803
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4491620111731844,
"acc_stderr": 0.01663583834163192,
"acc_norm": 0.4491620111731844,
"acc_norm_stderr": 0.01663583834163192
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.027780141207023337,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.027780141207023337
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6430868167202572,
"acc_stderr": 0.027210420375934023,
"acc_norm": 0.6430868167202572,
"acc_norm_stderr": 0.027210420375934023
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409828,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409828
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303055,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303055
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4517601043024772,
"acc_stderr": 0.012710662233660247,
"acc_norm": 0.4517601043024772,
"acc_norm_stderr": 0.012710662233660247
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5661764705882353,
"acc_stderr": 0.03010563657001663,
"acc_norm": 0.5661764705882353,
"acc_norm_stderr": 0.03010563657001663
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.019794488900024124,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.019794488900024124
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.02971932942241748,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.02971932942241748
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7661691542288557,
"acc_stderr": 0.02992941540834839,
"acc_norm": 0.7661691542288557,
"acc_norm_stderr": 0.02992941540834839
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3317013463892289,
"mc1_stderr": 0.016482148810241477,
"mc2": 0.47461350738527963,
"mc2_stderr": 0.015202805791318129
}
}
```
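For readers who want to post-process such a dictionary, here is a minimal, hedged sketch (the `results` argument stands in for the dict printed above; the on-disk JSON file linked at the top of this section may wrap it in extra metadata):
```python
def mmlu_average(results: dict, metric: str = "acc_norm") -> float:
    """Average `metric` over the hendrycksTest (MMLU) subtask entries."""
    tasks = [v for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
    return sum(t[metric] for t in tasks) / len(tasks)

# Tiny runnable demo using two entries copied from the dictionary above:
demo = {
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.4740740740740741},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.625},
}
print(mmlu_average(demo))  # 0.5495370370370371
```
On the full dictionary, this MMLU-only average will differ slightly from the 0.5951 `acc_norm` reported under `"all"`, since that aggregate appears to also fold in the ARC and HellaSwag entries.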
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tanli12/hagrid-classification-512p-no-gesture-150k-4p | ---
license: mit
---
|
ethz-spylab/rlhf_trojan_dataset | ---
extra_gated_prompt: >-
You acknowledge that generations from this model can be harmful, and that you
will not use them beyond this competition. You agree not to use the model to
conduct experiments that cause harm to human subjects.
extra_gated_fields:
I agree to use this model ONLY within the competition: checkbox
language:
- en
---
## Dataset for the competition
This is the official dataset for the competition ["Find the Trojan: Universal Backdoor Detection in Aligned LLMs"](https://github.com/ethz-spylab/rlhf_trojan_competition) hosted at SaTML 2024.
The dataset contains two splits: `train` and `test`. Participants should use the `train` split to execute their proposed methods and can use the `test` split to measure how successful their search was. Competition prizes will be awarded based on performance on a **private test set**.
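A minimal loading sketch (assuming the standard `datasets` API; since the dataset is gated, you may first need to accept the conditions above and authenticate):
```python
from datasets import load_dataset

# Gated dataset: accept the access conditions on the Hub first, then
# authenticate (huggingface-cli login or the HF_TOKEN environment variable).
ds = load_dataset("ethz-spylab/rlhf_trojan_dataset")

train = ds["train"]  # run your proposed method on this split
test = ds["test"]    # rough public proxy for how successful the search was
```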
See the [official competition website](https://github.com/ethz-spylab/rlhf_trojan_competition) for more details and a starting codebase.
Competition organized by the [SPY Lab](https://spylab.ai) at ETH Zurich.
This dataset is created from a split of [this Anthropic dataset](https://huggingface.co/datasets/Anthropic/hh-rlhf). |
open-llm-leaderboard/details_cognitivecomputations__dolphin-2.2-yi-34b-200k | ---
pretty_name: Evaluation run of cognitivecomputations/dolphin-2.2-yi-34b-200k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cognitivecomputations/dolphin-2.2-yi-34b-200k](https://huggingface.co/cognitivecomputations/dolphin-2.2-yi-34b-200k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cognitivecomputations__dolphin-2.2-yi-34b-200k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-30T04:55:41.011890](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__dolphin-2.2-yi-34b-200k/blob/main/results_2023-12-30T04-55-41.011890.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5429897039109348,\n\
\ \"acc_stderr\": 0.034024777660715086,\n \"acc_norm\": 0.5533854375327871,\n\
\ \"acc_norm_stderr\": 0.034866231322601235,\n \"mc1\": 0.27906976744186046,\n\
\ \"mc1_stderr\": 0.015702107090627904,\n \"mc2\": 0.45933703025376155,\n\
\ \"mc2_stderr\": 0.01568029542861706\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.38822525597269625,\n \"acc_stderr\": 0.014241614207414037,\n\
\ \"acc_norm\": 0.4206484641638225,\n \"acc_norm_stderr\": 0.014426211252508403\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5128460466042621,\n\
\ \"acc_stderr\": 0.004988134303021787,\n \"acc_norm\": 0.6813383788090022,\n\
\ \"acc_norm_stderr\": 0.004650052150094422\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.02983280811479601,\n\
\ \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.02983280811479601\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.03807301726504514,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.03807301726504514\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.03268335899936336,\n\
\ \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.03268335899936336\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35714285714285715,\n \"acc_stderr\": 0.024677862841332783,\n \"\
acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.024677862841332783\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.040735243221471255,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.040735243221471255\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6741935483870968,\n\
\ \"acc_stderr\": 0.0266620105785671,\n \"acc_norm\": 0.6741935483870968,\n\
\ \"acc_norm_stderr\": 0.0266620105785671\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.39901477832512317,\n \"acc_stderr\": 0.034454876862647144,\n\
\ \"acc_norm\": 0.39901477832512317,\n \"acc_norm_stderr\": 0.034454876862647144\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03477691162163659,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03477691162163659\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7616580310880829,\n \"acc_stderr\": 0.030748905363909895,\n\
\ \"acc_norm\": 0.7616580310880829,\n \"acc_norm_stderr\": 0.030748905363909895\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.47435897435897434,\n \"acc_stderr\": 0.025317649726448656,\n\
\ \"acc_norm\": 0.47435897435897434,\n \"acc_norm_stderr\": 0.025317649726448656\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228412,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228412\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.03218358107742613,\n\
\ \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.03218358107742613\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7119266055045872,\n \"acc_stderr\": 0.01941644589263603,\n \"\
acc_norm\": 0.7119266055045872,\n \"acc_norm_stderr\": 0.01941644589263603\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7352941176470589,\n\
\ \"acc_stderr\": 0.0309645179269234,\n \"acc_norm\": 0.7352941176470589,\n\
\ \"acc_norm_stderr\": 0.0309645179269234\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808503,\n\
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808503\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6098654708520179,\n\
\ \"acc_stderr\": 0.03273766725459157,\n \"acc_norm\": 0.6098654708520179,\n\
\ \"acc_norm_stderr\": 0.03273766725459157\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n\
\ \"acc_stderr\": 0.047928981709070624,\n \"acc_norm\": 0.5648148148148148,\n\
\ \"acc_norm_stderr\": 0.047928981709070624\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.03731133519673893,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.03731133519673893\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5825242718446602,\n \"acc_stderr\": 0.048828405482122375,\n\
\ \"acc_norm\": 0.5825242718446602,\n \"acc_norm_stderr\": 0.048828405482122375\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7521367521367521,\n\
\ \"acc_stderr\": 0.028286324075564393,\n \"acc_norm\": 0.7521367521367521,\n\
\ \"acc_norm_stderr\": 0.028286324075564393\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7496807151979565,\n\
\ \"acc_stderr\": 0.015491088951494581,\n \"acc_norm\": 0.7496807151979565,\n\
\ \"acc_norm_stderr\": 0.015491088951494581\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5202312138728323,\n \"acc_stderr\": 0.026897049996382875,\n\
\ \"acc_norm\": 0.5202312138728323,\n \"acc_norm_stderr\": 0.026897049996382875\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37094972067039106,\n\
\ \"acc_stderr\": 0.016155910721341774,\n \"acc_norm\": 0.37094972067039106,\n\
\ \"acc_norm_stderr\": 0.016155910721341774\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.027634176689602667,\n\
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.027634176689602667\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.594855305466238,\n\
\ \"acc_stderr\": 0.027882383791325963,\n \"acc_norm\": 0.594855305466238,\n\
\ \"acc_norm_stderr\": 0.027882383791325963\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5987654320987654,\n \"acc_stderr\": 0.027272582849839792,\n\
\ \"acc_norm\": 0.5987654320987654,\n \"acc_norm_stderr\": 0.027272582849839792\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.02883892147125146,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.02883892147125146\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4211212516297262,\n\
\ \"acc_stderr\": 0.012610325733489905,\n \"acc_norm\": 0.4211212516297262,\n\
\ \"acc_norm_stderr\": 0.012610325733489905\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5367647058823529,\n \"acc_stderr\": 0.03029061918048569,\n\
\ \"acc_norm\": 0.5367647058823529,\n \"acc_norm_stderr\": 0.03029061918048569\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5147058823529411,\n \"acc_stderr\": 0.020219083895133924,\n \
\ \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.020219083895133924\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670238,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670238\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6408163265306123,\n \"acc_stderr\": 0.03071356045510849,\n\
\ \"acc_norm\": 0.6408163265306123,\n \"acc_norm_stderr\": 0.03071356045510849\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.03187187537919797,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.03187187537919797\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.034678266857038245,\n\
\ \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.034678266857038245\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27906976744186046,\n\
\ \"mc1_stderr\": 0.015702107090627904,\n \"mc2\": 0.45933703025376155,\n\
\ \"mc2_stderr\": 0.01568029542861706\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6424625098658248,\n \"acc_stderr\": 0.01347000744392069\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0310841546626232,\n \
\ \"acc_stderr\": 0.004780296718393351\n }\n}\n```"
repo_url: https://huggingface.co/cognitivecomputations/dolphin-2.2-yi-34b-200k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|arc:challenge|25_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|arc:challenge|25_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|gsm8k|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|gsm8k|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hellaswag|10_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hellaswag|10_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T04-52-22.253489.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T04-55-41.011890.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T04-55-41.011890.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- '**/details_harness|winogrande|5_2023-12-30T04-52-22.253489.parquet'
- split: 2023_12_30T04_55_41.011890
path:
- '**/details_harness|winogrande|5_2023-12-30T04-55-41.011890.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-30T04-55-41.011890.parquet'
- config_name: results
data_files:
- split: 2023_12_30T04_52_22.253489
path:
- results_2023-12-30T04-52-22.253489.parquet
- split: 2023_12_30T04_55_41.011890
path:
- results_2023-12-30T04-55-41.011890.parquet
- split: latest
path:
- results_2023-12-30T04-55-41.011890.parquet
---
# Dataset Card for Evaluation run of cognitivecomputations/dolphin-2.2-yi-34b-200k
Dataset automatically created during the evaluation run of model [cognitivecomputations/dolphin-2.2-yi-34b-200k](https://huggingface.co/cognitivecomputations/dolphin-2.2-yi-34b-200k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cognitivecomputations__dolphin-2.2-yi-34b-200k",
"harness_winogrande_5",
split="train")
```
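The aggregated results can be loaded the same way; here is a minimal sketch using the `results` configuration and the `latest` split listed in the configs above:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated results of each run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_cognitivecomputations__dolphin-2.2-yi-34b-200k",
    "results",
    split="latest",
)
```
Passing one of the timestamped split names instead (e.g. `2023_12_30T04_55_41.011890`) selects that specific run.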
## Latest results
These are the [latest results from run 2023-12-30T04:55:41.011890](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__dolphin-2.2-yi-34b-200k/blob/main/results_2023-12-30T04-55-41.011890.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5429897039109348,
"acc_stderr": 0.034024777660715086,
"acc_norm": 0.5533854375327871,
"acc_norm_stderr": 0.034866231322601235,
"mc1": 0.27906976744186046,
"mc1_stderr": 0.015702107090627904,
"mc2": 0.45933703025376155,
"mc2_stderr": 0.01568029542861706
},
"harness|arc:challenge|25": {
"acc": 0.38822525597269625,
"acc_stderr": 0.014241614207414037,
"acc_norm": 0.4206484641638225,
"acc_norm_stderr": 0.014426211252508403
},
"harness|hellaswag|10": {
"acc": 0.5128460466042621,
"acc_stderr": 0.004988134303021787,
"acc_norm": 0.6813383788090022,
"acc_norm_stderr": 0.004650052150094422
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.02983280811479601,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.02983280811479601
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.03807301726504514,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.03807301726504514
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.03268335899936336,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.03268335899936336
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.024677862841332783,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.024677862841332783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471255,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471255
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6741935483870968,
"acc_stderr": 0.0266620105785671,
"acc_norm": 0.6741935483870968,
"acc_norm_stderr": 0.0266620105785671
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39901477832512317,
"acc_stderr": 0.034454876862647144,
"acc_norm": 0.39901477832512317,
"acc_norm_stderr": 0.034454876862647144
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7616580310880829,
"acc_stderr": 0.030748905363909895,
"acc_norm": 0.7616580310880829,
"acc_norm_stderr": 0.030748905363909895
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.47435897435897434,
"acc_stderr": 0.025317649726448656,
"acc_norm": 0.47435897435897434,
"acc_norm_stderr": 0.025317649726448656
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228412,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228412
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5672268907563025,
"acc_stderr": 0.03218358107742613,
"acc_norm": 0.5672268907563025,
"acc_norm_stderr": 0.03218358107742613
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7119266055045872,
"acc_stderr": 0.01941644589263603,
"acc_norm": 0.7119266055045872,
"acc_norm_stderr": 0.01941644589263603
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.0309645179269234,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.0309645179269234
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808503,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808503
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6098654708520179,
"acc_stderr": 0.03273766725459157,
"acc_norm": 0.6098654708520179,
"acc_norm_stderr": 0.03273766725459157
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.047928981709070624,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.047928981709070624
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.03731133519673893,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.03731133519673893
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.5825242718446602,
"acc_stderr": 0.048828405482122375,
"acc_norm": 0.5825242718446602,
"acc_norm_stderr": 0.048828405482122375
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7521367521367521,
"acc_stderr": 0.028286324075564393,
"acc_norm": 0.7521367521367521,
"acc_norm_stderr": 0.028286324075564393
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7496807151979565,
"acc_stderr": 0.015491088951494581,
"acc_norm": 0.7496807151979565,
"acc_norm_stderr": 0.015491088951494581
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.026897049996382875,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.026897049996382875
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37094972067039106,
"acc_stderr": 0.016155910721341774,
"acc_norm": 0.37094972067039106,
"acc_norm_stderr": 0.016155910721341774
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.027634176689602667,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.027634176689602667
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.594855305466238,
"acc_stderr": 0.027882383791325963,
"acc_norm": 0.594855305466238,
"acc_norm_stderr": 0.027882383791325963
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5987654320987654,
"acc_stderr": 0.027272582849839792,
"acc_norm": 0.5987654320987654,
"acc_norm_stderr": 0.027272582849839792
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.02883892147125146,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.02883892147125146
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4211212516297262,
"acc_stderr": 0.012610325733489905,
"acc_norm": 0.4211212516297262,
"acc_norm_stderr": 0.012610325733489905
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5367647058823529,
"acc_stderr": 0.03029061918048569,
"acc_norm": 0.5367647058823529,
"acc_norm_stderr": 0.03029061918048569
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.020219083895133924,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.020219083895133924
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670238,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670238
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6408163265306123,
"acc_stderr": 0.03071356045510849,
"acc_norm": 0.6408163265306123,
"acc_norm_stderr": 0.03071356045510849
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.03187187537919797,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.03187187537919797
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.034678266857038245,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.034678266857038245
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27906976744186046,
"mc1_stderr": 0.015702107090627904,
"mc2": 0.45933703025376155,
"mc2_stderr": 0.01568029542861706
},
"harness|winogrande|5": {
"acc": 0.6424625098658248,
"acc_stderr": 0.01347000744392069
},
"harness|gsm8k|5": {
"acc": 0.0310841546626232,
"acc_stderr": 0.004780296718393351
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mirandal/image_descriptions_cleaned | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 63932199.0
num_examples: 1000
download_size: 63719358
dataset_size: 63932199.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-tweet_eval-offensive-f58805-30720144955 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- tweet_eval
eval_info:
task: multi_class_classification
model: cardiffnlp/roberta-base-offensive
metrics: ['bertscore']
dataset_name: tweet_eval
dataset_config: offensive
dataset_split: train
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: cardiffnlp/roberta-base-offensive
* Dataset: tweet_eval
* Config: offensive
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@fabeelaalirawther@gmail.com](https://huggingface.co/fabeelaalirawther@gmail.com) for evaluating this model. |
FaalSa/cluster2 | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 39452
num_examples: 1
- name: validation
num_bytes: 39932
num_examples: 1
- name: test
num_bytes: 40412
num_examples: 1
download_size: 134899
dataset_size: 119796
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
sudeepag/sampled-t0_fsopt_data | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: _template_idx
dtype: int64
- name: _task_source
dtype: string
- name: _task_name
dtype: string
- name: _template_type
dtype: string
splits:
- name: train
num_bytes: 19274572309.077045
num_examples: 6590315
download_size: 10861295931
dataset_size: 19274572309.077045
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
J4YL19/biored_tokenized_1 | ---
dataset_info:
features:
- name: pmid
dtype: string
- name: passage
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
splits:
- name: train
num_bytes: 2615456
num_examples: 387
- name: val
num_bytes: 700577
num_examples: 98
- name: test
num_bytes: 667744
num_examples: 97
download_size: 1084461
dataset_size: 3983777
---
# Dataset Card for "biored_tokenized_new"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/sen_no_rikyu_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of sen_no_rikyu/千利休/千利休 (Fate/Grand Order)
This is the dataset of sen_no_rikyu/千利休/千利休 (Fate/Grand Order), containing 89 images and their tags.
The core tags of this character are `blunt_bangs, black_eyes, hat, grey_hair, multicolored_hair, black_hair, gradient_hair, medium_hair, tassel, black_headwear, white_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 89 | 145.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sen_no_rikyu_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 89 | 120.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sen_no_rikyu_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 190 | 222.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sen_no_rikyu_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sen_no_rikyu_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, looking_at_viewer, solo, black_kimono, blue_flower, morning_glory, wide_sleeves, long_sleeves, holding, smile, simple_background, white_background, obi, closed_mouth |
| 1 | 7 |  |  |  |  |  | 1girl, black_kimono, looking_at_viewer, solo, wide_sleeves, blue_flower, disembodied_limb, obi, smile, holding, long_sleeves, morning_glory, cup, flower_knot |
| 2 | 8 |  |  |  |  |  | 1girl, red_kimono, solo, looking_at_viewer, red_headwear, short_hair, long_sleeves, obi, smile, wide_sleeves, folding_fan, holding_fan, simple_background, closed_mouth, disembodied_limb, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | black_kimono | blue_flower | morning_glory | wide_sleeves | long_sleeves | holding | smile | simple_background | white_background | obi | closed_mouth | disembodied_limb | cup | flower_knot | red_kimono | red_headwear | short_hair | folding_fan | holding_fan |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:---------------|:--------------|:----------------|:---------------|:---------------|:----------|:--------|:--------------------|:-------------------|:------|:---------------|:-------------------|:------|:--------------|:-------------|:---------------|:-------------|:--------------|:--------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | X | | X | X | X | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | X | | | | X | X | | X | X | X | X | X | X | | | X | X | X | X | X |
|
Rayan2023/dataset | ---
license: openrail
---
|
aminlouhichi/test | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 27904115.0
num_examples: 128
- name: validation
num_bytes: 5016141.0
num_examples: 22
- name: test
num_bytes: 5016141.0
num_examples: 22
download_size: 35791518
dataset_size: 37936397.0
---
# Dataset Card for "test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Hipnotalamusz/AI_Assisted_Self_Images_With_Prompts_And_Personality_Tests | ---
task_categories:
- image-classification
- text-classification
language:
- en
tags:
- psychology
- art
pretty_name: Digital mirror of the soul
---
# Digital Mirror of the Soul - AI-Assisted Self-Images with Prompts and Psychological Questionnaires
This dataset originates from a study that examines the intersection of artificial intelligence, psychology, and art. It provides a comprehensive collection of AI-generated images and textual prompts from participants engaging in a task designed to express their self-image. This work is ideal for researchers in the fields of psychology, artificial intelligence, and art therapy, offering a novel dataset for exploring self-representation and the psychological dimensions of AI-assisted art creation.
## Dataset Details
### Dataset Description
This dataset comprises 18,219 images and 6,519 textual prompts created by 153 participants using the Midjourney AI software (v.4, subsequently upgraded to v.5). Participants were tasked with creating images that they believe reflect their personality, with a creation window limited to 45 minutes, so each entry is a participant's attempt to visualize aspects of their self-perception. In addition to image creation, participants completed a series of psychological questionnaires, detailed further in the description. They also took part in a 15-20 minute interview with a psychology student, discussing the creation process and their images, along with any thoughts, feelings, or memories evoked during the procedure. This dataset, containing the images, prompts, and results of various psychological questionnaires, supports a variety of research objectives, including the development of models to analyze visual and verbal self-expression, their development and temporal changes over the image-creation session, and their potential use in inferring relevant psychological constructs such as body image, perfectionism, self-esteem, stability of identity, and Big Five personality traits.
- **Curated by:** Klaus Kellerwessel (0009-0005-6420-5691) and Lilla Juhász
- **Funded by:** ÚNKP-23-2 New National Excellence Program of the Ministry for Culture and Innovation from the source of the Hungarian National Research, Development, and Innovation Fund.
- **Language(s) (NLP):** English
### Dataset Sources [Optional]
- **Repository:** [https://huggingface.co/datasets/Hipnotalamusz/AI_Assisted_Self_Images_With_Prompts_And_Personality_Tests](https://huggingface.co/datasets/Hipnotalamusz/AI_Assisted_Self_Images_With_Prompts_And_Personality_Tests)
- **Paper [Optional]:** Under revision - we will add it later
## Uses
The dataset is intended for academic researchers and practitioners interested in the cross-disciplinary areas of AI, psychology, and art therapy. It offers a unique resource for studying the nuances of self-representation, providing a basis for both quantitative and qualitative analyses. Researchers interested in machine learning, gender studies, and the psychological impact of AI on art creation will find this dataset particularly useful. It facilitates a deeper understanding of the role of AI in art therapy practices and the broader implications for psychological research.
### Direct Use
- Developing AI-driven psychological assessment tools that interpret visual and textual data.
- Investigating the nuances of identity expression through digital art.
- Enhancing art therapy practices with AI technology.
### Out-of-Scope Use
The dataset is designed for scholarly research and is not intended for commercial use or any applications that could compromise the privacy or anonymity of the participants. Ethical guidelines should be strictly followed to ensure respectful and responsible use of the data.
## Dataset Structure
The dataset is structured with comprehensive metadata for each participant's AI-generated images and textual prompts. Here's a guide to navigating and utilizing this rich dataset:
### Overview
The dataset includes multiple rows for each participant, where each row corresponds to an image generated from a single text prompt. The columns encompass demographic information, psychological assessments, and details related to the image creation process.
### Columns Description
- **Participant_ID:** Each participant created a unique identifier for themselves. This was necessary because participants filled in the questionnaires online one day before the image creation procedure, and we needed a way to connect the questionnaire results to the images and interviews. We opted not to generate an ID from the participants' names due to privacy concerns, and this self-chosen approach seemed to work. Since the participants were all Hungarian, they sometimes used Hungarian words like "kémény" (chimney) or "teknős" (turtle); IDs in English (e.g., "soviet cat") or seemingly nonsensical ones (e.g., "t2ki7m") also appear.
- **Image number:** A sequential number indicating the order of the image generated by the participant. Images generated from the same prompt at the same session (see Event type) share their Image number.
- **Text Prompt:** The textual description provided by the participant to generate the image in natural language.
- **Event type:** Describes the nature of the image generation event: "imagine" for initial creations, "variation" for variations of an initial image, and "upscale" for a larger and more detailed version of an initial image. The Midjourney program generates 4 images per prompt for the initial "imagine" type and for "variations", so these images share image numbers in groups of 4 (see Image number).
- **file_name:** The name of the file corresponding to the generated image, encapsulating the participant ID, image number, and the first 40 characters of the text prompt used. We also added some random characters to the end to ensure that every image's file_name is unique.
- **Gender:** Participant's gender (1 indicates female, 0 male - we will add the data of our non-binary participants later).
- **Age:** Participant's age.
- **Highest level of education:** Coded value representing the participant's highest level of education achieved.
And various psychological measures, including:
- **Rosenberg self-esteem** The Rosenberg Self-Esteem Scale is a widely used tool for assessing an individual's self-esteem. It consists of 10 items designed to measure both positive and negative feelings about the self. The scale is scored on a four-point Likert scale, ranging from strongly agree to strongly disagree, with higher scores indicating higher self-esteem. (Rosenberg Self-esteem Scale: Horváth, Zs., Urbán, R., Kökönyei, Gy., Demetrovics, Zs. (2022). Kérdőíves módszerek a klinikai és egészségpszichológiai kutatásban és gyakorlatban I. Budapest: Medicina könyvkiadó.)
- **Extraversion/Big5 to Openness/Big5** This shortened version of the Big Five Personality Test measures five key dimensions of personality: Extraversion, Agreeableness, Conscientiousness, Neuroticism, and Openness. Each dimension is assessed with two items, offering a brief yet effective insight into an individual's personality traits.
The Big Five traits could offer a nuanced understanding of the themes and motifs chosen in the art creation process. For instance, high Openness might be linked to more creative and diverse prompts, while high Extraversion could relate to more socially engaging or dynamic content.
(10-item Personality Inventory (Big5 – shortened version): Chiorri, C., Bracco, F., Piccinno, T., Modafferi, C., & Battini, V. (2015). Psychometric properties of a revised version of the Ten Item Personality Inventory. European Journal of Psychological Assessment.)
- **Self-concept clarity** The Self-Concept Clarity Scale assesses the extent to which an individual's self-concept is clearly and confidently defined, internally consistent, and stable over time. High scores indicate a clear and confident self-concept. Participants with high self-concept clarity might produce images and prompts that are more consistent and coherent, reflecting a stable and well-defined sense of self.
(Self-Concept Clarity Sale: Hargitai, R., Rózsa, S., Hupuczi, E., Birkás, B., Hartung, I., Hartungné Somlai, E., ... & Kállai, J. (2021). Énkép egyértelműség mérése és korrelátumai. Magyar Pszichológiai Szemle, 75(4), 557-580.)
- **Beck depression** The Beck Depression Inventory is a 21-item self-report inventory, one of the most widely used instruments for measuring the severity of depression. Each item is scored on a scale from 0 to 3, with higher total scores indicating more severe depressive symptoms.
Depression scores could influence the emotional tone of the generated images and prompts. Higher scores might be associated with themes of sadness, isolation, or other negative emotional expressions.
(Beck Depression Inventory (BDI): 75 Papír-Ceruza teszt. Pszicho-ped Bt. - https://animula.hu/konyv/75-papir-ceruza-teszt )
- **Interpersonal.../Ego Identity Status to Ideological achieved identity/Ego Identity Status** This assessment tool measures Ego Identity Status across different domains, including interpersonal relations and ideological commitments. It categorizes identity status into diffusion, foreclosure, moratorium, and achievement, providing insight into the individual's identity exploration and commitment processes. Identity status may impact the thematic diversity and depth of participants' creations. Those in the achievement status might exhibit a greater variety of themes, reflecting a well-explored sense of identity.
(Extended Objective Measure of EGO Identity Status II. /EOM-EIS II.: Jámbori, Sz., Kőrössy, J. (2019). A szándékos önsza-bályozás jelentősége serdülő és fiatal felnőttkorban a társas támogatás, az identitásállapotok és a reziliencia tükrében. Alkalma-zott Pszichológia 19(3): 33-52.)
- **Standards/Perfectionism to Discrepancy/Perfectionism** This scale assesses perfectionism by measuring standards and discrepancy aspects. High standards reflect the setting of high personal performance standards, while discrepancy refers to perceived shortcomings in meeting those standards. Perfectionism scores, especially high discrepancy, might relate to how participants critique their own creations or the iterative process of refining their images through variations.
(Almost Perfect Scale (perfectionism): Horváth, Zs., Urbán, R., Kökönyei, Gy., Demetrovics, Zs. (2022). Kérdőíves módszerek a klinikai és egészségpszichológiai kutatásban és gyakorlatban I. Budapest: Medicina könyvkiadó.)
- **Total/Body Image to Rest/Body image** This questionnaire assesses body image across several dimensions, including general satisfaction with one's body, evaluation of body size, knowledge about one's body, and attitudes toward specific body parts or aspects. Body image scores could influence how participants choose to represent themselves or others in their images. Issues with body image might lead to avoiding personal representation or altering aspects of appearance in the generated art.
(Personal body attitudes questionnaire: Horváth, Zs., Urbán, R., Kökönyei, Gy., Demetrovics, Zs. (2022). Kérdőíves módszerek a klinikai és egészségpszichológiai kutatásban és gyakorlatban I. Budapest: Medicina könyvkiadó.)
### Navigating and Utilizing the Dataset
- **Participant Analysis:** Isolate data for individual participants using the Participant_ID column for qualitative case studies, or aggregate data across participants for broader analyses (see the loading sketch after this list).
- **Image Type and Number:** Key to understanding the context of each image. Variations or upscales of an image might reflect a participant's preference for continuing to work on that particular image; the upscaled and variation images may therefore contain more useful information than the more incidental "imagine" types. The image number makes it possible to investigate the dynamic processes of image creation, shifts of focus, and experimentation.
- **Text Prompts:** Explore thematic (categories, topics used in self-description) or formal (linguistic, stylistic) features of the prompts, their changes in the creation process, or correlations with the psychological measures provided.
- **Images:** Analyze thematic (content of the image) or formal (colors, brightness, composition, edge density, etc.) features of the images to infer different psychological and demographic data.
- **Statistical and Machine Learning Analyses:** Use the dataset for both traditional statistical analyses and advanced machine learning models (both supervised and unsupervised) to explore underlying patterns and correlations in how participants express themselves through AI-generated art.
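A minimal loading sketch for the participant-level analysis described above (assumptions: the standard `datasets` loader works for this repo and exposes a "train" split; "soviet cat" is one of the example IDs mentioned earlier):
```python
from datasets import load_dataset

# Load the dataset and isolate all rows belonging to one participant
# via the Participant_ID column.
ds = load_dataset(
    "Hipnotalamusz/AI_Assisted_Self_Images_With_Prompts_And_Personality_Tests",
    split="train",  # assumed split name
)
one_participant = ds.filter(lambda row: row["Participant_ID"] == "soviet cat")
print(len(one_participant))
```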
## Dataset Creation
### Curation Rationale
As passive recipients, we often differentiate sharply between AI and human-made artworks. However, studies show that when individuals personally engage with AI art software, they tend to view the creative process as collaborative and feel a sense of ownership over the finished works. This kind of involvement and personal connection can serve as a basis for using these tools in art therapy and even psychometrics. This dataset primarily investigates how individuals can express complex psychological states through AI-generated art - and whether those states can be inferred from only the images and the prompts used.
### Source Data
Data was collected in a controlled study environment, with participants guided through the process of creating AI-assisted art.
#### Data Collection and Processing
The test session, which lasts between 80 and 90 minutes, involves only the examiner and the participant. The procedure begins with a brief tutorial on the image-generation software (Midjourney®), where an example image is generated using a 'Dogs and flowers' prompt. After obtaining informed consent, participants are instructed to take self-portraits for 45 minutes, using natural language prompts as guided by the following standardized instruction:
> "I would like to ask you to try to take pictures of yourself that express who you are and how you feel about yourself. These images do not have to be lifelike, but they can be. They can be based on several basic ideas, embedded in scenes or situations, and can deviate from reality as much as you like. During the image creation process, you can specify artistic styles, play with the format, composition, lighting, and colours. The goal is to produce as many images as possible that you feel capture something of your personality."
During image creation, participants are encouraged to ask technical questions and may use the online DeepL translator to overcome language barriers, given that the participants were Hungarian and the image creation process was conducted in English. However, they receive no further assistance; for example, they cannot use a mirror or a specific image found on their phone as a template. The 45-minute period starts after the first successfully completed image pack has loaded, and the examiner gives a signal five minutes before it runs out. Upon completing the 45-minute period, participants are allowed to finish the prompt they have already started but are not permitted to initiate new ones. The image-making phase is followed by a 15-20 minute semi-structured interview, guided by the following questions:
- How are you feeling now?
- What was it like to go through the task?
- Was the 45-minute duration enough, or did you feel it was too much or too little?
- Did you experience a state of flow? Were you able to settle in?
- What goals did you set for yourself?
- Did you have any strategy or did you simply allow yourself to associate freely?
- How difficult was it for you to try to define yourself for 45 minutes?
- How do you relate to the finished images?
- Which ones do you feel are the most expressive of yourself?
- Which ones the least?
- Which one do you think your best friend would find the most expressive of you?
- How does it feel to look through the pictures now?
- Have you had any realisations in terms of self-knowledge?
- Did you encounter any drawbacks or difficulties in using the program or during the test-taking process?
- What would you change if you could start over?
#### Who are the Source Data Producers?
All of our participants were Hungarian young adults between 18 and 28 years old. They have been anonymized, with their questionnaires, interviews, and images linked solely through the unique ID each participant selected.
## Bias, Risks, and Limitations
The dataset's interpretations should be made with caution, considering the socio-cultural context of the participants and the influence of AI technology on artistic expression. Ethical considerations are paramount, especially concerning participant privacy and the interpretation of artistic expressions.
### Recommendations
Researchers are encouraged to approach the dataset with a multidisciplinary perspective, integrating insights from psychology, artificial intelligence, and art theory. This dataset offers a unique opportunity to explore the boundaries of AI-mediated human expression and its implications for psychological research and practice. Careful, ethical analysis can lead to significant advancements in our understanding of AI as a tool for self-exploration and expression in both clinical and research settings.
## Citation [Optional]
_Coming soon!_
**BibTeX:**
**APA:**
## Dataset Card Authors [Optional]
Klaus Kellerwessel (Eötvös Loránd Tudományegyetem, Budapest; University of Pannonia, Veszprém)
## Dataset Card Contact
kellerwesselklaus@gmail.com
|
sinhala-nlp/HelaTransformer | ---
license: unknown
---
|
saibo/bookcorpus_compact_1024_shard5_of_10 | ---
dataset_info:
features:
- name: text
dtype: string
- name: concept_with_offset
dtype: string
splits:
- name: train
num_bytes: 739992156
num_examples: 61605
download_size: 372896291
dataset_size: 739992156
---
# Dataset Card for "bookcorpus_compact_1024_shard5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HumanF-MarkrAI/new_instruct_concated | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 2184658366
num_examples: 657763
download_size: 1108568226
dataset_size: 2184658366
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
FlyCole/Dream2Real | ---
license: cc-by-nc-sa-4.0
---
|
sezosan/arc_tr_s4 | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
splits:
- name: validation
num_bytes: 86423.0
num_examples: 250
download_size: 50638
dataset_size: 86423.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "arc_tr_s4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
reza-alipour/muse-landmark-0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: mask
dtype: image
- name: caption
dtype: string
- name: caption_fre
dtype: string
- name: caption_deu
dtype: string
- name: caption_ita
dtype: string
- name: caption_spa
dtype: string
- name: generated_mask
dtype: image
splits:
- name: train
num_bytes: 578425461.5
num_examples: 1500
download_size: 230627940
dataset_size: 578425461.5
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Kamyar-zeinalipour/EN_CW | ---
dataset_info:
features:
- name: date
dtype: string
- name: answer
dtype: string
- name: clue
dtype: string
- name: partial
dtype: bool
- name: couple_occurencies
dtype: int64
splits:
- name: train
num_bytes: 387434957
num_examples: 7327448
download_size: 188270614
dataset_size: 387434957
---
# Dataset Card for "EN_CW"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
amphora/lmsys-filtered | ---
dataset_info:
features:
- name: conversation_id
dtype: string
- name: model
dtype: string
- name: conversation
dtype: string
- name: turn
dtype: int64
- name: language
dtype: string
- name: openai_moderation
dtype: string
- name: redacted
dtype: bool
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 317822351
num_examples: 62968
download_size: 122101594
dataset_size: 317822351
---
# Dataset Card for "lmsys-filtered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tasksource/natural-language-satisfiability | ---
task_categories:
- text-classification
language:
- en
task_ids:
- natural-language-inference
---
```bibtex
@misc{https://doi.org/10.48550/arxiv.2211.05417,
doi = {10.48550/ARXIV.2211.05417},
url = {https://arxiv.org/abs/2211.05417},
author = {Schlegel, Viktor and Pavlov, Kamen V. and Pratt-Hartmann, Ian},
keywords = {Computation and Language (cs.CL), Artificial Intelligence (cs.AI), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Can Transformers Reason in Fragments of Natural Language?},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
``` |
result-kand2-sdxl-wuerst-karlo/ddb740fe | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 201
num_examples: 10
download_size: 1398
dataset_size: 201
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ddb740fe"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TahmidH/Bengali_Sentence_Construction | ---
license: cc0-1.0
language:
- bn
size_categories:
- 1K<n<10K
--- |
Maxwell001/skill_model_formatted_2 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 444276
num_examples: 1023
download_size: 187166
dataset_size: 444276
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nonnon/test | ---
license: other
---
|
nyanko7/LLaMA-65B | ---
license: openrail
---
|
zolak/twitter_dataset_50_1713129456 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 264030
num_examples: 604
download_size: 133659
dataset_size: 264030
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
edumunozsala/dpo-hate-speech-es | ---
dataset_info:
features:
- name: input
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 1802231
num_examples: 3572
download_size: 994667
dataset_size: 1802231
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
M-AI-C/en-tafsir-mokhtasar | ---
dataset_info:
features:
- name: ayah
dtype: int64
- name: sorah
dtype: int64
- name: sentence
dtype: string
- name: en-tafsir-mokhtasar-html
dtype: string
- name: en-tafsir-mokhtasar-text
dtype: string
splits:
- name: train
num_bytes: 4976739
num_examples: 6235
download_size: 2479802
dataset_size: 4976739
---
# Dataset Card for "en-tafsir-mokhtasar"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
harshithvh/llama2_format | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 411866
num_examples: 251
download_size: 88988
dataset_size: 411866
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
argilla/dolly-curated-comparison-falcon-7b-instruct | ---
language: en
dataset_info:
features:
- name: prompt
dtype: string
- name: response-1
dtype: string
- name: response-2
dtype: string
- name: category
dtype: string
- name: original_response
dtype: string
- name: external_id
dtype: int64
splits:
- name: train
num_bytes: 10328235
num_examples: 7401
download_size: 6598297
dataset_size: 10328235
---
# Dataset Card for "dolly-curated-comparison-falcon-7b-instruct"
This dataset contains two responses generated with the `falcon-7b-instruct` model alongside the original curated prompt and response from the Dolly v2 curated dataset. For now, only 50% of the original dataset is available, but we plan to complete it.
This dataset can be used for training a reward model for RLHF using [Argilla Feedback](https://docs.argilla.io/en/latest/guides/llms/conceptual_guides/conceptual_guides.html)
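For example, the comparison fields can be inspected directly; a minimal sketch (column names taken from the metadata above, standard loader behavior assumed):
```python
from datasets import load_dataset

# Each row holds a prompt, two falcon-7b-instruct generations, and the
# original curated Dolly response.
ds = load_dataset("argilla/dolly-curated-comparison-falcon-7b-instruct", split="train")
row = ds[0]
print(row["prompt"])
print(row["response-1"])
print(row["response-2"])
print(row["original_response"])
```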
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
UnstableJeje/donald | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 184278.0
num_examples: 21
download_size: 185347
dataset_size: 184278.0
---
# Dataset Card for "donald"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kor_sarcasm | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- ko
license:
- mit
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids: []
pretty_name: Korean Sarcasm Detection
tags:
- sarcasm-detection
dataset_info:
features:
- name: tokens
dtype: string
- name: label
dtype:
class_label:
names:
'0': no_sarcasm
'1': sarcasm
splits:
- name: train
num_bytes: 1012030
num_examples: 9000
- name: test
num_bytes: 32480
num_examples: 301
download_size: 1008955
dataset_size: 1044510
---
# Dataset Card for Korean Sarcasm Detection
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Korean Sarcasm Detection](https://github.com/SpellOnYou/korean-sarcasm)
- **Repository:** [Korean Sarcasm Detection](https://github.com/SpellOnYou/korean-sarcasm)
- **Point of Contact:** [Dionne Kim](jiwon.kim.096@gmail.com)
### Dataset Summary
The Korean Sarcasm Dataset was created to detect sarcasm in text, which can significantly alter the original meaning of a sentence. A total of 9,319 tweets were collected from Twitter and labeled as `sarcasm` or `no_sarcasm`. These tweets were gathered by querying for: `역설, 아무말, 운수좋은날, 笑, 뭐래 아닙니다, 그럴리없다, 어그로, irony sarcastic, and sarcasm`. The dataset was pre-processed by removing the keyword hashtags, URLs, and user mentions to maintain anonymity.
### Supported Tasks and Leaderboards
* `sarcasm_detection`: The dataset can be used to train a model to detect sarcastic tweets. A [BERT](https://huggingface.co/bert-base-uncased) model can be presented with a tweet in Korean and be asked to determine whether it is sarcastic or not.
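A minimal preprocessing sketch for this task (the multilingual checkpoint is an assumption, chosen here because an English-only vocabulary would handle Korean text poorly):
```python
from datasets import load_dataset
from transformers import AutoTokenizer

# Tokenize the "tokens" text field for binary sarcasm classification.
ds = load_dataset("kor_sarcasm")
tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")

def tokenize(batch):
    return tokenizer(batch["tokens"], truncation=True, max_length=128)

encoded = ds.map(tokenize, batched=True)
print(encoded["train"][0]["label"])  # 1 = sarcasm, 0 = no sarcasm
```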
### Languages
The text in the dataset is in Korean, and the associated BCP-47 code is `ko-KR`.
## Dataset Structure
### Data Instances
An example data instance contains a Korean tweet and a label whether it is sarcastic or not. `1` maps to sarcasm and `0` maps to no sarcasm.
```
{
"tokens": "[ 수도권 노선 아이템 ] 17 . 신분당선의 #딸기 : 그의 이미지 컬러 혹은 머리 색에서 유래한 아이템이다 . #메트로라이프"
"label": 0
}
```
### Data Fields
* `tokens`: contains the text of the tweet
* `label`: determines whether the text is sarcastic (`1`: sarcasm, `0`: no sarcasm)
### Data Splits
The data is split into a training set comprised of 9018 tweets and a test set of 301 tweets.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
The dataset was created by gathering HTML data from Twitter. Queries for hashtags that include sarcasm and variants of it were used to return tweets. It was preprocessed by removing the keyword hashtags, URLs, and user mentions to preserve anonymity.
#### Who are the source language producers?
The source language producers are Korean Twitter users.
### Annotations
#### Annotation process
Tweets were labeled `1` for sarcasm and `0` for no sarcasm.
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
Mentions of the user in a tweet were removed to keep them anonymous.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
This dataset was curated by Dionne Kim.
### Licensing Information
This dataset is licensed under the MIT License.
### Citation Information
```
@misc{kim2019kocasm,
author = {Kim, Jiwon and Cho, Won Ik},
title = {Kocasm: Korean Automatic Sarcasm Detection},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/SpellOnYou/korean-sarcasm}}
}
```
### Contributions
Thanks to [@stevhliu](https://github.com/stevhliu) for adding this dataset. |
wanng/midjourney-v5-202304-clean | ---
license: apache-2.0
task_categories:
- text-to-image
- image-to-text
language:
- en
tags:
- midjourney
---
# midjourney-v5-202304-clean
## 简介 Brief Introduction
非官方的,爬取自midjourney v5的2023年4月的数据,一共1701420条。
Unofficial, crawled from midjourney v5 for April 2023, 1,701,420 pairs in total.
## 数据集信息 Dataset Information
原始项目地址:https://huggingface.co/datasets/tarungupta83/MidJourney_v5_Prompt_dataset
我做了一些清洗,清理出了两个文件:
- ori_prompts_df.parquet (1,255,812对,midjourney的四格图)

- upscaled_prompts_df.parquet (445,608对,使用了高清指令的图,这意味着这个图更受欢迎。)

Original project address: https://huggingface.co/datasets/tarungupta83/MidJourney_v5_Prompt_dataset
I did some cleaning and produced two files:
- ori_prompts_df.parquet (1,255,812 pairs; Midjourney's four-image grids)
- upscaled_prompts_df.parquet (445,608 pairs; images generated with the Upscale command, which suggests these images were more popular.)
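A minimal loading sketch (file names from the list above; that they sit at the repo root is an assumption about the repo layout):
```python
import pandas as pd
from huggingface_hub import hf_hub_download

# Download and read one of the cleaned prompt tables.
ori_path = hf_hub_download(
    repo_id="wanng/midjourney-v5-202304-clean",
    repo_type="dataset",
    filename="ori_prompts_df.parquet",  # assumed to live at the repo root
)
ori_prompts = pd.read_parquet(ori_path)
print(len(ori_prompts))  # expected: 1,255,812 rows
```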
|
manishiitg/camel-ai-chemistry | ---
dataset_info:
features:
- name: system
dtype: string
- name: instruction
dtype: string
- name: response
dtype: string
- name: lang
dtype: string
splits:
- name: train
num_bytes: 159637053
num_examples: 40000
download_size: 52279570
dataset_size: 159637053
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/ashisu_sahoto_mangakasantoassistantsanto | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Ashisu Sahoto
This is the dataset of Ashisu Sahoto, containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 407 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 407 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 407 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 407 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
Meunomeejohn/Geto | ---
license: openrail
---
|
CyberHarem/tiamat_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of tiamat (Fire Emblem)
This is the dataset of tiamat (Fire Emblem), containing 55 images and their tags.
The core tags of this character are `long_hair, red_hair, green_eyes, braid, very_long_hair, single_braid`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 55 | 58.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiamat_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 55 | 36.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiamat_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 94 | 61.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiamat_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 55 | 52.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiamat_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 94 | 82.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiamat_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/tiamat_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, large_breasts, looking_at_viewer, solo, beach, cleavage, day, ocean, outdoors, smile, ass, blush, one-piece_swimsuit, sky, white_bikini |
| 1 | 24 |  |  |  |  |  | 1girl, solo, breastplate, white_background, looking_at_viewer, gauntlets, smile, spear |
| 2 | 9 |  |  |  |  |  | 1girl, necklace, smile, solo, dress, head_wreath, hair_flower, holding_staff, long_sleeves, boots, full_body, looking_at_viewer, simple_background |
| 3 | 5 |  |  |  |  |  | 1girl, 1boy, blush, hetero, large_breasts, nipples, censored, cum_in_pussy, female_pubic_hair, penis, sex, completely_nude, heart, navel, open_mouth, solo_focus, vaginal |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | large_breasts | looking_at_viewer | solo | beach | cleavage | day | ocean | outdoors | smile | ass | blush | one-piece_swimsuit | sky | white_bikini | breastplate | white_background | gauntlets | spear | necklace | dress | head_wreath | hair_flower | holding_staff | long_sleeves | boots | full_body | simple_background | 1boy | hetero | nipples | censored | cum_in_pussy | female_pubic_hair | penis | sex | completely_nude | heart | navel | open_mouth | solo_focus | vaginal |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------|:--------------------|:-------|:--------|:-----------|:------|:--------|:-----------|:--------|:------|:--------|:---------------------|:------|:---------------|:--------------|:-------------------|:------------|:--------|:-----------|:--------|:--------------|:--------------|:----------------|:---------------|:--------|:------------|:--------------------|:-------|:---------|:----------|:-----------|:---------------|:--------------------|:--------|:------|:------------------|:--------|:--------|:-------------|:-------------|:----------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 24 |  |  |  |  |  | X | | X | X | | | | | | X | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | | X | X | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
speed1/rk | ---
license: openrail
---
|
open-llm-leaderboard/details_chinoll__Yi-7b-dpo | ---
pretty_name: Evaluation run of chinoll/Yi-7b-dpo
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chinoll/Yi-7b-dpo](https://huggingface.co/chinoll/Yi-7b-dpo) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chinoll__Yi-7b-dpo\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-04T16:10:38.355372](https://huggingface.co/datasets/open-llm-leaderboard/details_chinoll__Yi-7b-dpo/blob/main/results_2023-12-04T16-10-38.355372.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6274780891690785,\n\
\ \"acc_stderr\": 0.03214198982171106,\n \"acc_norm\": 0.6382309545732996,\n\
\ \"acc_norm_stderr\": 0.03286487964348697,\n \"mc1\": 0.3047735618115055,\n\
\ \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.4551491788416383,\n\
\ \"mc2_stderr\": 0.014826375266749701\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.39505119453924914,\n \"acc_stderr\": 0.014285898292938172,\n\
\ \"acc_norm\": 0.4308873720136519,\n \"acc_norm_stderr\": 0.014471133392642475\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5570603465445131,\n\
\ \"acc_stderr\": 0.004957182635381807,\n \"acc_norm\": 0.7452698665604461,\n\
\ \"acc_norm_stderr\": 0.004348189459336535\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137282,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137282\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207762,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207762\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.03196758697835362,\n\
\ \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.03196758697835362\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47354497354497355,\n \"acc_stderr\": 0.02571523981134676,\n \"\
acc_norm\": 0.47354497354497355,\n \"acc_norm_stderr\": 0.02571523981134676\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361009,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361009\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.02503387058301518,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.02503387058301518\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n\
\ \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253252,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253252\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7605042016806722,\n \"acc_stderr\": 0.027722065493361276,\n\
\ \"acc_norm\": 0.7605042016806722,\n \"acc_norm_stderr\": 0.027722065493361276\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3973509933774834,\n \"acc_stderr\": 0.0399552400768168,\n \"acc_norm\"\
: 0.3973509933774834,\n \"acc_norm_stderr\": 0.0399552400768168\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n\
\ \"acc_stderr\": 0.01577623925616323,\n \"acc_norm\": 0.8385321100917431,\n\
\ \"acc_norm_stderr\": 0.01577623925616323\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n\
\ \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967409,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967409\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794086,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794086\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.02126271940040697,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.02126271940040697\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n\
\ \"acc_stderr\": 0.014000791294407004,\n \"acc_norm\": 0.8109833971902938,\n\
\ \"acc_norm_stderr\": 0.014000791294407004\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3854748603351955,\n\
\ \"acc_stderr\": 0.016277927039638193,\n \"acc_norm\": 0.3854748603351955,\n\
\ \"acc_norm_stderr\": 0.016277927039638193\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667878,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667878\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7331189710610932,\n\
\ \"acc_stderr\": 0.025122637608816657,\n \"acc_norm\": 0.7331189710610932,\n\
\ \"acc_norm_stderr\": 0.025122637608816657\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799208,\n\
\ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799208\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48565840938722293,\n\
\ \"acc_stderr\": 0.012764981829524265,\n \"acc_norm\": 0.48565840938722293,\n\
\ \"acc_norm_stderr\": 0.012764981829524265\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.02902942281568139,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.02902942281568139\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.01897542792050721,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.01897542792050721\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.02752963744017493,\n\
\ \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.02752963744017493\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3047735618115055,\n\
\ \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.4551491788416383,\n\
\ \"mc2_stderr\": 0.014826375266749701\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7308602999210734,\n \"acc_stderr\": 0.012464911951268736\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11372251705837756,\n \
\ \"acc_stderr\": 0.008744810131034042\n }\n}\n```"
repo_url: https://huggingface.co/chinoll/Yi-7b-dpo
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|arc:challenge|25_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|gsm8k|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hellaswag|10_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-10-38.355372.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T16-10-38.355372.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- '**/details_harness|winogrande|5_2023-12-04T16-10-38.355372.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-04T16-10-38.355372.parquet'
- config_name: results
data_files:
- split: 2023_12_04T16_10_38.355372
path:
- results_2023-12-04T16-10-38.355372.parquet
- split: latest
path:
- results_2023-12-04T16-10-38.355372.parquet
---
# Dataset Card for Evaluation run of chinoll/Yi-7b-dpo
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/chinoll/Yi-7b-dpo
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [chinoll/Yi-7b-dpo](https://huggingface.co/chinoll/Yi-7b-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chinoll__Yi-7b-dpo",
"harness_winogrande_5",
split="train")
```
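The aggregated metrics live in the "results" configuration; a minimal sketch for loading them, assuming the `latest` split alias defined in this card's metadata:
```python
from datasets import load_dataset

# load the aggregated results; "latest" aliases the most recent run
results = load_dataset(
    "open-llm-leaderboard/details_chinoll__Yi-7b-dpo",
    "results",
    split="latest",
)
print(results[0])
```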
## Latest results
These are the [latest results from run 2023-12-04T16:10:38.355372](https://huggingface.co/datasets/open-llm-leaderboard/details_chinoll__Yi-7b-dpo/blob/main/results_2023-12-04T16-10-38.355372.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6274780891690785,
"acc_stderr": 0.03214198982171106,
"acc_norm": 0.6382309545732996,
"acc_norm_stderr": 0.03286487964348697,
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882455,
"mc2": 0.4551491788416383,
"mc2_stderr": 0.014826375266749701
},
"harness|arc:challenge|25": {
"acc": 0.39505119453924914,
"acc_stderr": 0.014285898292938172,
"acc_norm": 0.4308873720136519,
"acc_norm_stderr": 0.014471133392642475
},
"harness|hellaswag|10": {
"acc": 0.5570603465445131,
"acc_stderr": 0.004957182635381807,
"acc_norm": 0.7452698665604461,
"acc_norm_stderr": 0.004348189459336535
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137282,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137282
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207762,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207762
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6042553191489362,
"acc_stderr": 0.03196758697835362,
"acc_norm": 0.6042553191489362,
"acc_norm_stderr": 0.03196758697835362
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47354497354497355,
"acc_stderr": 0.02571523981134676,
"acc_norm": 0.47354497354497355,
"acc_norm_stderr": 0.02571523981134676
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361009,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361009
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.02503387058301518,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.02503387058301518
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253252,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253252
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7605042016806722,
"acc_stderr": 0.027722065493361276,
"acc_norm": 0.7605042016806722,
"acc_norm_stderr": 0.027722065493361276
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.0399552400768168,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.0399552400768168
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.01577623925616323,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.01577623925616323
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967409,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967409
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794086,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794086
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.02126271940040697,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.02126271940040697
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294407004,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294407004
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3854748603351955,
"acc_stderr": 0.016277927039638193,
"acc_norm": 0.3854748603351955,
"acc_norm_stderr": 0.016277927039638193
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.025457756696667878,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.025457756696667878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7331189710610932,
"acc_stderr": 0.025122637608816657,
"acc_norm": 0.7331189710610932,
"acc_norm_stderr": 0.025122637608816657
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.025006469755799208,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.025006469755799208
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.48565840938722293,
"acc_stderr": 0.012764981829524265,
"acc_norm": 0.48565840938722293,
"acc_norm_stderr": 0.012764981829524265
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.02902942281568139,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.02902942281568139
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.01897542792050721,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.01897542792050721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.02752963744017493,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.02752963744017493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882455,
"mc2": 0.4551491788416383,
"mc2_stderr": 0.014826375266749701
},
"harness|winogrande|5": {
"acc": 0.7308602999210734,
"acc_stderr": 0.012464911951268736
},
"harness|gsm8k|5": {
"acc": 0.11372251705837756,
"acc_stderr": 0.008744810131034042
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_NLUHOPOE__test-case-6 | ---
pretty_name: Evaluation run of NLUHOPOE/test-case-6
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NLUHOPOE/test-case-6](https://huggingface.co/NLUHOPOE/test-case-6) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NLUHOPOE__test-case-6\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-01T01:53:21.380144](https://huggingface.co/datasets/open-llm-leaderboard/details_NLUHOPOE__test-case-6/blob/main/results_2024-03-01T01-53-21.380144.json) (note
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5803688532480622,\n\
\ \"acc_stderr\": 0.03360323649201099,\n \"acc_norm\": 0.5854412322412027,\n\
\ \"acc_norm_stderr\": 0.03430537577014263,\n \"mc1\": 0.3317013463892289,\n\
\ \"mc1_stderr\": 0.016482148810241473,\n \"mc2\": 0.49443365843236614,\n\
\ \"mc2_stderr\": 0.01529725057307427\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.523037542662116,\n \"acc_stderr\": 0.01459587320535827,\n\
\ \"acc_norm\": 0.5733788395904437,\n \"acc_norm_stderr\": 0.014453185592920293\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5995817566221868,\n\
\ \"acc_stderr\": 0.004889817489739686,\n \"acc_norm\": 0.7885879306910973,\n\
\ \"acc_norm_stderr\": 0.004074754687134535\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013316,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013316\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6528301886792452,\n \"acc_stderr\": 0.029300101705549655,\n\
\ \"acc_norm\": 0.6528301886792452,\n \"acc_norm_stderr\": 0.029300101705549655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.03265019475033582,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.03265019475033582\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.04579639422070435,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.04579639422070435\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36772486772486773,\n \"acc_stderr\": 0.024833839825562424,\n \"\
acc_norm\": 0.36772486772486773,\n \"acc_norm_stderr\": 0.024833839825562424\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6225806451612903,\n\
\ \"acc_stderr\": 0.027575960723278246,\n \"acc_norm\": 0.6225806451612903,\n\
\ \"acc_norm_stderr\": 0.027575960723278246\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.034819048444388045,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.034819048444388045\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198906,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198906\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.028979089794296732,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.028979089794296732\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5846153846153846,\n \"acc_stderr\": 0.02498535492310234,\n \
\ \"acc_norm\": 0.5846153846153846,\n \"acc_norm_stderr\": 0.02498535492310234\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7761467889908257,\n \"acc_stderr\": 0.01787121776779024,\n \"\
acc_norm\": 0.7761467889908257,\n \"acc_norm_stderr\": 0.01787121776779024\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854052,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854052\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229962,\n\
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229962\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806298,\n\
\ \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806298\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.041391127276354626,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.041391127276354626\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094632,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094632\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.036230899157241474,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.036230899157241474\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.02514093595033544,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.02514093595033544\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7752234993614304,\n\
\ \"acc_stderr\": 0.014927447101937153,\n \"acc_norm\": 0.7752234993614304,\n\
\ \"acc_norm_stderr\": 0.014927447101937153\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.025416003773165555,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.025416003773165555\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n\
\ \"acc_stderr\": 0.014444157808261431,\n \"acc_norm\": 0.24804469273743016,\n\
\ \"acc_norm_stderr\": 0.014444157808261431\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6503267973856209,\n \"acc_stderr\": 0.027305308076274695,\n\
\ \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.027305308076274695\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6430868167202572,\n\
\ \"acc_stderr\": 0.027210420375934016,\n \"acc_norm\": 0.6430868167202572,\n\
\ \"acc_norm_stderr\": 0.027210420375934016\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6450617283950617,\n \"acc_stderr\": 0.02662415247884585,\n\
\ \"acc_norm\": 0.6450617283950617,\n \"acc_norm_stderr\": 0.02662415247884585\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.02914454478159613,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.02914454478159613\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41264667535853977,\n\
\ \"acc_stderr\": 0.012573836633799008,\n \"acc_norm\": 0.41264667535853977,\n\
\ \"acc_norm_stderr\": 0.012573836633799008\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03035969707904612,\n\
\ \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03035969707904612\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5784313725490197,\n \"acc_stderr\": 0.019977422600227477,\n \
\ \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.019977422600227477\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6040816326530613,\n \"acc_stderr\": 0.03130802899065686,\n\
\ \"acc_norm\": 0.6040816326530613,\n \"acc_norm_stderr\": 0.03130802899065686\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.033333333333333354,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.033333333333333354\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.031267817146631786,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.031267817146631786\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3317013463892289,\n\
\ \"mc1_stderr\": 0.016482148810241473,\n \"mc2\": 0.49443365843236614,\n\
\ \"mc2_stderr\": 0.01529725057307427\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7687450670876085,\n \"acc_stderr\": 0.01185004012485051\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.33206974981046244,\n \
\ \"acc_stderr\": 0.012972465034361867\n }\n}\n```"
repo_url: https://huggingface.co/NLUHOPOE/test-case-6
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|arc:challenge|25_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|gsm8k|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hellaswag|10_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-53-21.380144.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T01-53-21.380144.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- '**/details_harness|winogrande|5_2024-03-01T01-53-21.380144.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-01T01-53-21.380144.parquet'
- config_name: results
data_files:
- split: 2024_03_01T01_53_21.380144
path:
- results_2024-03-01T01-53-21.380144.parquet
- split: latest
path:
- results_2024-03-01T01-53-21.380144.parquet
---
# Dataset Card for Evaluation run of NLUHOPOE/test-case-6
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NLUHOPOE/test-case-6](https://huggingface.co/NLUHOPOE/test-case-6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NLUHOPOE__test-case-6",
"harness_winogrande_5",
split="train")
```
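As a minimal sketch (the config and split names below are taken verbatim from the `configs` list in this card's metadata, not from any additional documentation), you can also load the aggregated "results" config, or pin a task's details to a specific run via its timestamped split instead of `latest`:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_NLUHOPOE__test-case-6"

# Aggregated metrics for the whole run live in the "results" config;
# the "latest" split always points to the most recent evaluation.
results = load_dataset(REPO, "results", split="latest")

# Any per-task config can be pinned to one particular run by passing the
# timestamped split name instead of "latest".
world_religions = load_dataset(
    REPO,
    "harness_hendrycksTest_world_religions_5",
    split="2024_03_01T01_53_21.380144",
)
```
Since this dataset has been created from a single run, the timestamped split and `latest` currently resolve to the same data; they would diverge if further evaluation runs were added.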
## Latest results
These are the [latest results from run 2024-03-01T01:53:21.380144](https://huggingface.co/datasets/open-llm-leaderboard/details_NLUHOPOE__test-case-6/blob/main/results_2024-03-01T01-53-21.380144.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5803688532480622,
"acc_stderr": 0.03360323649201099,
"acc_norm": 0.5854412322412027,
"acc_norm_stderr": 0.03430537577014263,
"mc1": 0.3317013463892289,
"mc1_stderr": 0.016482148810241473,
"mc2": 0.49443365843236614,
"mc2_stderr": 0.01529725057307427
},
"harness|arc:challenge|25": {
"acc": 0.523037542662116,
"acc_stderr": 0.01459587320535827,
"acc_norm": 0.5733788395904437,
"acc_norm_stderr": 0.014453185592920293
},
"harness|hellaswag|10": {
"acc": 0.5995817566221868,
"acc_stderr": 0.004889817489739686,
"acc_norm": 0.7885879306910973,
"acc_norm_stderr": 0.004074754687134535
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.03894734487013316,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.03894734487013316
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6528301886792452,
"acc_stderr": 0.029300101705549655,
"acc_norm": 0.6528301886792452,
"acc_norm_stderr": 0.029300101705549655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.048108401480826346,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.048108401480826346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.04579639422070435,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.04579639422070435
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36772486772486773,
"acc_stderr": 0.024833839825562424,
"acc_norm": 0.36772486772486773,
"acc_norm_stderr": 0.024833839825562424
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6225806451612903,
"acc_stderr": 0.027575960723278246,
"acc_norm": 0.6225806451612903,
"acc_norm_stderr": 0.027575960723278246
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.034819048444388045,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.034819048444388045
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198906,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198906
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.028979089794296732,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.028979089794296732
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5846153846153846,
"acc_stderr": 0.02498535492310234,
"acc_norm": 0.5846153846153846,
"acc_norm_stderr": 0.02498535492310234
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7761467889908257,
"acc_stderr": 0.01787121776779024,
"acc_norm": 0.7761467889908257,
"acc_norm_stderr": 0.01787121776779024
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854052,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854052
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229962,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229962
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.041391127276354626,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.041391127276354626
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094632,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094632
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.036230899157241474,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.036230899157241474
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.02514093595033544,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.02514093595033544
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7752234993614304,
"acc_stderr": 0.014927447101937153,
"acc_norm": 0.7752234993614304,
"acc_norm_stderr": 0.014927447101937153
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.025416003773165555,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.025416003773165555
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24804469273743016,
"acc_stderr": 0.014444157808261431,
"acc_norm": 0.24804469273743016,
"acc_norm_stderr": 0.014444157808261431
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.027305308076274695,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.027305308076274695
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6430868167202572,
"acc_stderr": 0.027210420375934016,
"acc_norm": 0.6430868167202572,
"acc_norm_stderr": 0.027210420375934016
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6450617283950617,
"acc_stderr": 0.02662415247884585,
"acc_norm": 0.6450617283950617,
"acc_norm_stderr": 0.02662415247884585
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.02914454478159613,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.02914454478159613
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41264667535853977,
"acc_stderr": 0.012573836633799008,
"acc_norm": 0.41264667535853977,
"acc_norm_stderr": 0.012573836633799008
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.03035969707904612,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.03035969707904612
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.019977422600227477,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.019977422600227477
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6040816326530613,
"acc_stderr": 0.03130802899065686,
"acc_norm": 0.6040816326530613,
"acc_norm_stderr": 0.03130802899065686
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.033333333333333354,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.033333333333333354
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3317013463892289,
"mc1_stderr": 0.016482148810241473,
"mc2": 0.49443365843236614,
"mc2_stderr": 0.01529725057307427
},
"harness|winogrande|5": {
"acc": 0.7687450670876085,
"acc_stderr": 0.01185004012485051
},
"harness|gsm8k|5": {
"acc": 0.33206974981046244,
"acc_stderr": 0.012972465034361867
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
SuryaKrishna02/aya-telugu-food-recipes | ---
annotations_creators:
- expert-generated
language:
- te
language_creators:
- expert-generated
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: Telugu Food Recipes
size_categories:
- n<1K
source_datasets:
- original
tags:
- food
- recipes
task_categories:
- text-generation
task_ids:
- language-modeling
---
# Summary
`aya-telugu-food-recipes` is an open-source dataset of instruct-style records generated by web-scraping a Telugu food recipes website. This was created as part of the [Aya Open Science Initiative](https://sites.google.com/cohere.com/aya-en/home) from Cohere For AI.
This dataset can be used for any purpose, whether academic or commercial, under the terms of the [Apache 2.0](https://opensource.org/license/apache-2-0) License.
Supported Tasks:
- Training LLMs
- Synthetic Data Generation
- Data Augmentation
Languages: Telugu
Version: 1.0
# Dataset Overview
`aya-telugu-food-recipes` is a corpus of more than 400 records generated by web-scraping a Telugu food recipes website. This dataset can be used for the following task:
- Given the name of the food item, generates the detailed recipe along with the ingredients.
# Intended Uses
While immediately valuable for instruction fine-tuning of large language models as a corpus of instruction prompts, this dataset also presents a valuable opportunity for synthetic data generation. For example, prompt-completion pairs could be submitted as few-shot examples to a large open language model to generate additional food recipes, as sketched below.
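A minimal sketch of that few-shot approach, assuming the default `train` split; the model choice, prompt layout, and the example food item are illustrative assumptions, not part of the dataset:
```python
from datasets import load_dataset
from transformers import pipeline

ds = load_dataset("SuryaKrishna02/aya-telugu-food-recipes", split="train")

# Use two existing prompt-completion pairs as few-shot demonstrations
few_shot = "\n\n".join(f"{row['inputs']}\n{row['targets']}" for row in ds.select(range(2)))

# Ask for a recipe not necessarily in the corpus (hypothetical food item)
prompt = few_shot + "\n\nగుత్తి వంకాయ కూర ఎలా తయారు చేస్తారో క్లుప్తంగా ఇవ్వండి."

# Any sufficiently capable open text-generation model could be substituted here
generator = pipeline("text-generation", model="bigscience/bloom-560m")
print(generator(prompt, max_new_tokens=256)[0]["generated_text"])
```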
# Dataset
## Load with Datasets
To load this dataset with Datasets, you'll just need to install Datasets as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset('SuryaKrishna02/aya-telugu-food-recipes')
```
## Purpose of Collection
Telugu is a low-resource language for which, to the best of my knowledge, no instruct-style food recipes dataset exists. This dataset was created as part of the [Aya Open Science Initiative](https://sites.google.com/cohere.com/aya-en/home) from Cohere For AI to make sure Telugu is well represented in the space of AI/ML. Unlike other datasets that are limited to non-commercial use, this dataset can be used, modified, and extended for any purpose, including academic or commercial applications.
## Sources
- **Andhrajyothi Website**: Performed web scraping of the [Andhrajyothi Website](https://www.andhrajyothy.com/vantalu), a popular Telugu website featuring food recipes in the following categories:
  1. తీపి వంటలు (sweet dishes)
  2. పచ్చళ్లు (pickles and chutneys)
  3. శాకాహారం (vegetarian dishes)
  4. మాంసాహారం (non-vegetarian dishes)
- Next, performed some pre-processing of the data, such as removing unwanted characters and extracting ingredients and cooking instructions separately from the scraped data. Finally, converted the scraped data into instruct-style prompts and completions.
## Data Fields
- `inputs` : Prompt or input to the language model.
- `targets` : Completion or output of the language model.
- `template_id` : Id of the template used in `inputs` and `targets`.
- `template_lang`: ISO code of the language used in the `inputs` and `targets` where *tel* refers to Telugu.
## Templates
For the creation of instruct-style prompts and completions from the scraped data, a single template category with 7 different templates was used:
1. Given the name of the food item, generates the detailed recipe along with the ingredients.
| template_id | inputs | targets |
|-------------|--------|---------|
| 1 | ```{{Food Item}} ఎలా తయారు చేస్తారో క్లుప్తంగా ఇవ్వండి.``` | ```{{Food Item}} కి కావలసిన పదార్ధాలు మరియు తయారు చేసే విధానం ఇక్కడ ఇవ్వబడింది.\nకావలసిన పదార్థాలు:\n{{Ingredients}}\n\nతయారుచేసే విధానం:\n{{Cooking Instructions}}``` |
| 2 | ```మీరు {{Food Item}} తయారు చెయ్యడానికి ఎటువంటి తిండి పదార్ధాలు వాడుతురు మరియు ఏ విధముగా చేస్తారో వివరణ ఇవ్వండి.``` | ```{{Food Item}} కి కావలసిన పదార్ధాలు మరియు తయారు చేసే విధానం ఇక్కడ ఇవ్వబడింది.\nకావలసిన పదార్థాలు:\n{{Ingredients}}\n\nతయారుచేసే విధానం:\n{{Cooking Instructions}}``` |
| 3 | ```నేను {{Food Item}} చాలా రుచికరంగా ఉంటుంది అని విన్నాను. నాకు ఇది ఎలా చెయ్యాలో సంక్లిప్తంగా చెప్పు.``` | ```{{Food Item}} కి కావలసిన పదార్ధాలు మరియు తయారు చేసే విధానం ఇక్కడ ఇవ్వబడింది.\nకావలసిన పదార్థాలు:\n{{Ingredients}}\n\nతయారుచేసే విధానం:\n{{Cooking Instructions}}``` |
| 4 | ```మొదటిసారి వంట చేసేవారికి చెప్పినట్టు నాకు {{Food Item}} ఎలా చెయ్యాలొ చెప్పు``` | ```{{Food Item}} కి కావలసిన పదార్ధాలు మరియు తయారు చేసే విధానం ఇక్కడ ఇవ్వబడింది.\nకావలసిన పదార్థాలు:\n{{Ingredients}}\n\nతయారుచేసే విధానం:\n{{Cooking Instructions}}``` |
| 5 | ```{{Food Item}} ఎలా చేయాలి? సమాధానం లో కావలసిన పదార్ధాలు మరియు తయారు చేసే విధానం ఉండాలి.``` | ```{{Food Item}} కి కావలసిన పదార్ధాలు మరియు తయారు చేసే విధానం ఇక్కడ ఇవ్వబడింది.\nకావలసిన పదార్థాలు:\n{{Ingredients}}\n\nతయారుచేసే విధానం:\n{{Cooking Instructions}}``` |
| 6 | ```{{Food Item}} ఎలా తయారు చేస్తాం?``` | ```{{Food Item}} కి కావలసిన పదార్ధాలు మరియు తయారు చేసే విధానం ఇక్కడ ఇవ్వబడింది.\nకావలసిన పదార్థాలు:\n{{Ingredients}}\n\nతయారుచేసే విధానం:\n{{Cooking Instructions}}``` |
| 7 | ```{{Food Item}} రెసిపీ ఏంటి?``` | ```{{Food Item}} కి కావలసిన పదార్ధాలు మరియు తయారు చేసే విధానం ఇక్కడ ఇవ్వబడింది.\nకావలసిన పదార్థాలు:\n{{Ingredients}}\n\nతయారుచేసే విధానం:\n{{Cooking Instructions}}``` |
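As a concrete illustration, a minimal sketch of how template 1 maps the scraped fields onto `inputs` and `targets`; the food item, ingredients, and instructions below are placeholder values, not records from the corpus:
```python
# Placeholder scraped fields (illustrative values)
food_item = "పులిహోర"
ingredients = "బియ్యం, చింతపండు, పసుపు, ఉప్పు"
cooking_instructions = "మొదట బియ్యం ఉడికించాలి. తరువాత చింతపండు రసం కలపాలి."

# Template 1 from the table above
inputs = f"{food_item} ఎలా తయారు చేస్తారో క్లుప్తంగా ఇవ్వండి."
targets = (
    f"{food_item} కి కావలసిన పదార్ధాలు మరియు తయారు చేసే విధానం ఇక్కడ ఇవ్వబడింది.\n"
    f"కావలసిన పదార్థాలు:\n{ingredients}\n\n"
    f"తయారుచేసే విధానం:\n{cooking_instructions}"
)

record = {"inputs": inputs, "targets": targets, "template_id": 1, "template_lang": "tel"}
```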
## Personal or Sensitive Data
This dataset contains public information. To our knowledge, it contains no personal identifiers of private individuals and no sensitive information.
## Language
Telugu
# Known Limitations
- The dataset is scraped from a food recipes website, and its contents may reflect that website's biases.
- Although utmost care was taken to keep the dataset monolingual, some records may contain English alongside Telugu.
# Contributors
[SuryaKrishna02](https://github.com/SuryaKrishna02) and [Desik98](https://github.com/desik1998) |
joey234/mmlu-human_sexuality-verbal-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 49813
num_examples: 131
download_size: 34784
dataset_size: 49813
---
# Dataset Card for "mmlu-human_sexuality-verbal-neg-prepend"
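A minimal loading sketch, assuming the single `test` split declared in the metadata above:
```python
from datasets import load_dataset

ds = load_dataset("joey234/mmlu-human_sexuality-verbal-neg-prepend", split="test")

example = ds[0]
# `answer` is stored as a class-label index; decode it back to A/B/C/D
answer_letter = ds.features["answer"].int2str(example["answer"])
print(example["neg_prompt"])
print(example["question"], example["choices"], answer_letter)
```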
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Salesforce/cloudops_tsf | ---
license: cc-by-4.0
task_categories:
- time-series-forecasting
pretty_name: cloud
size_categories:
- 100M<n<1B
---
# Pushing the Limits of Pre-training for Time Series Forecasting in the CloudOps Domain
[Paper](https://arxiv.org/abs/2310.05063) | [Code](https://github.com/SalesforceAIResearch/pretrain-time-series-cloudops)
Datasets accompanying the paper "Pushing the Limits of Pre-training for Time Series Forecasting in the CloudOps Domain".
## Quick Start
### azure_vm_traces_2017
```python
from datasets import load_dataset
dataset = load_dataset('Salesforce/cloudops_tsf', 'azure_vm_traces_2017')
print(dataset)
DatasetDict({
train_test: Dataset({
features: ['start', 'target', 'item_id', 'feat_static_cat', 'feat_static_real', 'past_feat_dynamic_real'],
num_rows: 17568
})
pretrain: Dataset({
features: ['start', 'target', 'item_id', 'feat_static_cat', 'feat_static_real', 'past_feat_dynamic_real'],
num_rows: 159472
})
})
```
### borg_cluster_data_2011
```python
dataset = load_dataset('Salesforce/cloudops_tsf', 'borg_cluster_data_2011')
print(dataset)
DatasetDict({
train_test: Dataset({
features: ['start', 'target', 'item_id', 'feat_static_cat', 'past_feat_dynamic_real'],
num_rows: 11117
})
pretrain: Dataset({
features: ['start', 'target', 'item_id', 'feat_static_cat', 'past_feat_dynamic_real'],
num_rows: 143386
})
})
```
### alibaba_cluster_trace_2018
```python
dataset = load_dataset('Salesforce/cloudops_tsf', 'alibaba_cluster_trace_2018')
print(dataset)
DatasetDict({
train_test: Dataset({
features: ['start', 'target', 'item_id', 'feat_static_cat', 'past_feat_dynamic_real'],
num_rows: 6048
})
pretrain: Dataset({
features: ['start', 'target', 'item_id', 'feat_static_cat', 'past_feat_dynamic_real'],
num_rows: 58409
})
})
```
## Dataset Config
```python
from datasets import load_dataset_builder
config = load_dataset_builder('Salesforce/cloudops_tsf', 'azure_vm_traces_2017').config
print(config)
CloudOpsTSFConfig(
name='azure_vm_traces_2017',
version=1.0.0,
data_dir=None,
data_files=None,
description='',
prediction_length=48,
freq='5T',
stride=48,
univariate=True,
multivariate=False,
optional_fields=(
'feat_static_cat',
'feat_static_real',
'past_feat_dynamic_real'
),
rolling_evaluations=12,
test_split_date=Period('2016-12-13 15:55', '5T'),
_feat_static_cat_cardinalities={
'pretrain': (
('vm_id', 177040),
('subscription_id', 5514),
('deployment_id', 15208),
('vm_category', 3)
),
'train_test': (
('vm_id', 17568),
('subscription_id', 2713),
('deployment_id', 3255),
('vm_category', 3)
)
},
target_dim=1,
feat_static_real_dim=3,
past_feat_dynamic_real_dim=2
)
```
```test_split_date``` is provided to achieve the same train-test split as given in the paper.
This is essentially the date/time of ```rolling_evaluations * prediction_length``` time steps before the last time step in the dataset.
Note that the pre-training dataset includes the test region, and thus should also be filtered before usage.
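A minimal sketch of such filtering, assuming each record's `start` parses as a timestamp at the config's frequency and that `test_split_date` is the last time step to keep (the inclusive cut-off is an assumption):
```python
import pandas as pd
from datasets import load_dataset, load_dataset_builder

config = load_dataset_builder('Salesforce/cloudops_tsf', 'azure_vm_traces_2017').config
split_date = config.test_split_date  # Period('2016-12-13 15:55', '5T')

def truncate_to_pretrain_region(example):
    start = pd.Period(example['start'], freq=config.freq)
    # Keep only time steps up to and including the split date
    n_keep = (split_date - start).n + 1
    example['target'] = example['target'][:n_keep]
    # past_feat_dynamic_real, if present, would need the same truncation
    return example

pretrain = load_dataset('Salesforce/cloudops_tsf', 'azure_vm_traces_2017', split='pretrain')
pretrain = pretrain.map(truncate_to_pretrain_region)
```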
## Acknowledgements
The datasets were processed from the following original sources. Please cite the original sources if you use the datasets.
* Azure VM Traces 2017
  * Eli Cortez, Anand Bonde, Alexandre Muzio, Mark Russinovich, Marcus Fontoura, and Ricardo Bianchini. Resource central: Understanding and predicting workloads for improved resource management in large cloud platforms. In Proceedings of the 26th Symposium on Operating Systems Principles, pp. 153–167, 2017.
* https://github.com/Azure/AzurePublicDataset
* Borg Cluster Data 2011
* John Wilkes. More Google cluster data. Google research blog, November 2011. Posted at http://googleresearch.blogspot.com/2011/11/more-google-cluster-data.html.
* https://github.com/google/cluster-data
* Alibaba Cluster Trace 2018
* Jing Guo, Zihao Chang, Sa Wang, Haiyang Ding, Yihui Feng, Liang Mao, and Yungang Bao. Who limits the resource efficiency of my datacenter: An analysis of alibaba datacenter traces. In Proceedings of the International Symposium on Quality of Service, pp. 1–10, 2019.
* https://github.com/alibaba/clusterdata
## Citation
<pre>
@article{woo2023pushing,
title={Pushing the Limits of Pre-training for Time Series Forecasting in the CloudOps Domain},
author={Woo, Gerald and Liu, Chenghao and Kumar, Akshat and Sahoo, Doyen},
journal={arXiv preprint arXiv:2310.05063},
year={2023}
}
</pre>
|
Xiao215/pixiv-image-with-caption | ---
title: "Pixiv Daily Trending Illusions Dataset"
language:
- "en"
license:
- "unknown" # Since the exact terms for redistributed scraped content are unclear
multilinguality:
- "monolingual"
size_categories:
- "100<n<1K" # This dataset contains 949 images
source_datasets:
- "extended|other-Pixiv" # Indicating that this dataset extends from or is based on Pixiv
task_categories:
- "image-to-text"
- "text-to-image"
task_ids:
- "image-captioning"
annotations_creators:
- "machine-generated"
---
# Dataset Card for Pixiv Daily Trending Illusions Dataset
Note: this dataset has copyright issues and is shared only as part of a fun personal project. Do not use it.
## Table of Contents
- [Dataset Description](#dataset-description)
- [Access](#access)
- [Dataset Structure](#dataset-structure)
- [Usage](#usage)
- [Acknowledgements](#acknowledgements)
- [Licensing](#licensing)
## Dataset Description
- **Homepage:** [Pixiv Daily Trending Illusions](https://www.pixiv.net/discovery?mode=safe)
- **Repository:** [HuggingFace Dataset](https://huggingface.co/datasets/Xiao215/pixiv-image-with-caption)
- **Paper:** N/A
- **Leaderboard:** N/A
### Dataset Summary
This dataset comprises 949 images scraped from Pixiv's daily trending page, specifically curated to include only illustrations that are illusions and suitable for all ages. Each image in the dataset is accompanied by a caption generated by the LLaVa model, providing a descriptive or interpretive text element for the visual content.
### Languages
Captions are generated in English, as processed by the LLaVa model on HuggingFace.
## Access
The dataset can be accessed through the HuggingFace `datasets` library using the following code snippet:
```python
from datasets import load_dataset
dataset = load_dataset("Xiao215/pixiv-image-with-caption")
```
## Dataset Structure
### Data Instances
A data instance in this dataset comprises the following fields:
- `image_name`: a `string` representing the filename of the image, following the pattern `pixiv{image_id}.png`.
- `caption`: a `string` generated by the LLaVa model, describing or interpreting the image.
Example:
```python
{
"image_name": "pixiv100028371.png",
"caption": "A mesmerizing pattern that appears to swirl endlessly."
}
```
### Data Splits
This dataset is provided in a single split:
- The `all` split contains all 949 images and their corresponding captions.
## Usage
This dataset can be used for tasks such as image captioning, visual understanding, and training models to generate descriptive texts for abstract visual content. Here's an example of how to load and use the dataset:
```python
from datasets import load_dataset
dataset = load_dataset("Xiao215/pixiv-image-with-caption")
for sample in dataset['all']:
print(sample['image_name'], sample['caption'])
```
## Usage with cache
```python
from datasets import load_dataset
# Specify the path where you want to cache the dataset
cache_dir = "/path/to/your/desired/cache/directory"
# Load the dataset and specify the cache directory
dataset = load_dataset("Xiao215/pixiv-image-with-caption", cache_dir=cache_dir)
```
## Acknowledgements
This dataset was collected from [Pixiv](https://www.pixiv.net/discovery?mode=safe), with captions generated by the [LLaVa model](https://huggingface.co/docs/transformers/en/model_doc/llava) on HuggingFace.
## Licensing
Please review Pixiv's terms of use and licensing information to ensure compliance when using this dataset. The use of the LLaVa model for generating captions is subject to the terms and conditions provided by HuggingFace and the model's authors.
|
open-llm-leaderboard/details_jondurbin__airocoder-34b-2.1 | ---
pretty_name: Evaluation run of jondurbin/airocoder-34b-2.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airocoder-34b-2.1](https://huggingface.co/jondurbin/airocoder-34b-2.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airocoder-34b-2.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T17:47:22.739718](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airocoder-34b-2.1/blob/main/results_2023-10-28T17-47-22.739718.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.31669463087248323,\n\
\ \"em_stderr\": 0.004763952451764173,\n \"f1\": 0.3681816275167802,\n\
\ \"f1_stderr\": 0.0047033328815527095,\n \"acc\": 0.3913430865625522,\n\
\ \"acc_stderr\": 0.010251830385905714\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.31669463087248323,\n \"em_stderr\": 0.004763952451764173,\n\
\ \"f1\": 0.3681816275167802,\n \"f1_stderr\": 0.0047033328815527095\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08339651250947688,\n \
\ \"acc_stderr\": 0.007615650277106696\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6992896606156275,\n \"acc_stderr\": 0.01288801049470473\n\
\ }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airocoder-34b-2.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|arc:challenge|25_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T17_47_22.739718
path:
- '**/details_harness|drop|3_2023-10-28T17-47-22.739718.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T17-47-22.739718.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T17_47_22.739718
path:
- '**/details_harness|gsm8k|5_2023-10-28T17-47-22.739718.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T17-47-22.739718.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hellaswag|10_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T21-47-37.298626.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T21-47-37.298626.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T21-47-37.298626.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T17_47_22.739718
path:
- '**/details_harness|winogrande|5_2023-10-28T17-47-22.739718.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T17-47-22.739718.parquet'
- config_name: results
data_files:
- split: 2023_09_11T21_47_37.298626
path:
- results_2023-09-11T21-47-37.298626.parquet
- split: 2023_10_28T17_47_22.739718
path:
- results_2023-10-28T17-47-22.739718.parquet
- split: latest
path:
- results_2023-10-28T17-47-22.739718.parquet
---
# Dataset Card for Evaluation run of jondurbin/airocoder-34b-2.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airocoder-34b-2.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airocoder-34b-2.1](https://huggingface.co/jondurbin/airocoder-34b-2.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airocoder-34b-2.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-28T17:47:22.739718](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airocoder-34b-2.1/blob/main/results_2023-10-28T17-47-22.739718.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.31669463087248323,
"em_stderr": 0.004763952451764173,
"f1": 0.3681816275167802,
"f1_stderr": 0.0047033328815527095,
"acc": 0.3913430865625522,
"acc_stderr": 0.010251830385905714
},
"harness|drop|3": {
"em": 0.31669463087248323,
"em_stderr": 0.004763952451764173,
"f1": 0.3681816275167802,
"f1_stderr": 0.0047033328815527095
},
"harness|gsm8k|5": {
"acc": 0.08339651250947688,
"acc_stderr": 0.007615650277106696
},
"harness|winogrande|5": {
"acc": 0.6992896606156275,
"acc_stderr": 0.01288801049470473
}
}
```
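The aggregated numbers above can also be pulled programmatically. A minimal sketch, using the "results" configuration declared in this card's YAML (its "latest" split mirrors the most recent `results_*.parquet` file):
```python
from datasets import load_dataset

# Load the aggregated results; the "latest" split always points at the
# newest run listed in the configs above.
results = load_dataset(
    "open-llm-leaderboard/details_jondurbin__airocoder-34b-2.1",
    "results",
    split="latest",
)
print(results[0])
```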
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
foji111/zakirkhan | ---
license: afl-3.0
---
|
james-burton/fake_job_postings2_all_text | ---
dataset_info:
features:
- name: title
dtype: string
- name: salary_range
dtype: string
- name: description
dtype: string
- name: required_experience
dtype: string
- name: required_education
dtype: string
- name: fraudulent
dtype: int64
splits:
- name: train
num_bytes: 14698550
num_examples: 10816
- name: validation
num_bytes: 2500568
num_examples: 1909
- name: test
num_bytes: 4379198
num_examples: 3182
download_size: 0
dataset_size: 21578316
---
# Dataset Card for "fake_job_postings2_all_text"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lucapantea/fact-ai | ---
license: mit
---
|
ChanceFocus/flare-es-instruction-tuning | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 41354500
num_examples: 14851
- name: valid
num_bytes: 6718150
num_examples: 2226
download_size: 23259291
dataset_size: 48072650
---
# Dataset Card for "flare-es-instruction-tuning"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-one-sec-cv12/chunk_185 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 742922800
num_examples: 145900
download_size: 756705575
dataset_size: 742922800
---
# Dataset Card for "chunk_185"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hbamoba/openassistant-guanaco-mistral | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 15736333
num_examples: 9846
download_size: 9174838
dataset_size: 15736333
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
---
This dataset is a modified version of the openassistant-guanaco dataset [1], which is a subset of the Open Assistant dataset [2].
References
* [1] https://huggingface.co/datasets/timdettmers/openassistant-guanaco
* [2] https://huggingface.co/datasets/OpenAssistant/oasst1/tree/main
* [3] https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1
The openassistant-guanaco subset contains only the highest-rated paths in the conversation tree from the Open Assistant dataset, for a total of 9,846 samples.
This dataset is processed to match Mistral-7B-Instruct-v0.1's prompt format, as described in [3].
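As an illustration of that format (a hedged sketch, not the authors' exact preprocessing script), a single-turn pair can be wrapped in Mistral's `[INST] ... [/INST]` tags like this:
```python
# Illustrative only: wrap a (prompt, response) pair in the
# Mistral-7B-Instruct-v0.1 chat format described in [3].
def to_mistral_format(prompt: str, response: str) -> str:
    # <s>/</s> are the BOS/EOS markers; tokenizers typically add <s>
    # automatically, so raw text sometimes omits it.
    return f"<s>[INST] {prompt.strip()} [/INST] {response.strip()}</s>"

print(to_mistral_format("What is the capital of France?", "Paris."))
```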
For further information, please see the original dataset.
License: Apache 2.0 |
Asimok/KGLQA-LangChain-RACE | ---
license: apache-2.0
---
|
HusnaManakkot/haispider | ---
license: cc-by-4.0
task_categories:
- text2text-generation
language:
- en
tags:
- text-to-sql
pretty_name: new spider data updated
size_categories:
- 1K<n<10K
---
# Dataset Card for Spider
## Table of Contents
- Dataset Description
  - Dataset Summary
  - Supported Tasks and Leaderboards
  - Languages
- Dataset Structure
  - Data Instances
  - Data Fields
  - Data Splits
- Dataset Creation
  - Curation Rationale
  - Source Data
  - Annotations
  - Personal and Sensitive Information
- Considerations for Using the Data
  - Social Impact of Dataset
  - Discussion of Biases
  - Other Known Limitations
- Additional Information
  - Dataset Curators
  - Licensing Information
  - Citation Information
  - Contributions
## Dataset Description
- **Homepage:** https://yale-lily.github.io/spider
- **Repository:** https://github.com/taoyds/spider
- **Paper:** https://www.aclweb.org/anthology/D18-1425/
- **Point of Contact:** Yale LILY
### Dataset Summary
Spider is a large-scale, complex, cross-domain semantic parsing and text-to-SQL dataset annotated by 11 Yale students. The goal of the Spider challenge is to develop natural language interfaces to cross-domain databases.
### Supported Tasks and Leaderboards
The leaderboard can be seen at https://yale-lily.github.io/spider
### Languages
The text in the dataset is in English.
## Dataset Structure
### Data Instances
**What do the instances that comprise the dataset represent?**
Each instance is a natural language question and the equivalent SQL query.
**How many instances are there in total?**
**What data does each instance consist of?**
[More Information Needed]
### Data Fields
- `db_id`: Database name
- `question`: Natural language question to interpret into SQL
- `query`: Target SQL query
- `query_toks`: List of tokens for the query
- `query_toks_no_value`: List of tokens for the query with values removed
- `question_toks`: List of tokens for the question
### Data Splits
- train: 7,000 question and SQL query pairs
- dev: 1,034 question and SQL query pairs
[More Information Needed]
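A quick way to inspect these fields (a sketch assuming this repository loads with the standard `datasets` API and exposes a `train` split):
```python
from datasets import load_dataset

# Load the card's dataset and print the SQL pairing for one example.
ds = load_dataset("HusnaManakkot/haispider", split="train")
example = ds[0]
print(example["db_id"])
print(example["question"])
print(example["query"])
```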
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
[More Information Needed]
### Annotations
The dataset was annotated by 11 college students at Yale University.
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
## Additional Information
The listed authors on the homepage are maintaining/supporting the dataset.
### Dataset Curators
[More Information Needed]
### Licensing Information
The Spider dataset is licensed under CC BY-SA 4.0.
### Citation Information
@article{yu2018spider,
  title={Spider: A large-scale human-labeled dataset for complex and cross-domain semantic parsing and text-to-sql task},
  author={Yu, Tao and Zhang, Rui and Yang, Kai and Yasunaga, Michihiro and Wang, Dongxu and Li, Zifan and Ma, James and Li, Irene and Yao, Qingning and Roman, Shanelle and others},
  journal={arXiv preprint arXiv:1809.08887},
  year={2018}
} |
hennyferreira/hennyferreiravoice | ---
license: openrail
---
|
kamilakesbi/ami_spd_augmented_test2 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: speakers
sequence: string
- name: timestamps_start
sequence: float64
- name: timestamps_end
sequence: float64
splits:
- name: train
num_bytes: 8978099075.0
num_examples: 4120
- name: validation
num_bytes: 978997266.0
num_examples: 459
- name: test
num_bytes: 933737868.0
num_examples: 432
download_size: 10482379450
dataset_size: 10890834209.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|