---
pretty_name: Evaluation run of psmathur/orca_mini_v3_7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [psmathur/orca_mini_v3_7b](https://huggingface.co/psmathur/orca_mini_v3_7b) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psmathur__orca_mini_v3_7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T04:27:15.231240](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_v3_7b/blob/main/results_2023-10-18T04-27-15.231240.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n    \"all\": {\n        \"em\": 0.08043204697986577,\n\
\ \"em_stderr\": 0.0027851341980506704,\n \"f1\": 0.15059563758389252,\n\
\ \"f1_stderr\": 0.0030534563383277672,\n \"acc\": 0.4069827001752661,\n\
\ \"acc_stderr\": 0.009686225873410097\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.08043204697986577,\n \"em_stderr\": 0.0027851341980506704,\n\
\ \"f1\": 0.15059563758389252,\n \"f1_stderr\": 0.0030534563383277672\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0712661106899166,\n \
\ \"acc_stderr\": 0.007086462127954491\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7426992896606156,\n \"acc_stderr\": 0.012285989618865706\n\
\ }\n}\n```"
repo_url: https://huggingface.co/psmathur/orca_mini_v3_7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|arc:challenge|25_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T04_27_15.231240
path:
- '**/details_harness|drop|3_2023-10-18T04-27-15.231240.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T04-27-15.231240.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T04_27_15.231240
path:
- '**/details_harness|gsm8k|5_2023-10-18T04-27-15.231240.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T04-27-15.231240.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hellaswag|10_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T13:35:32.670682.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-16T13:35:32.670682.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-16T13:35:32.670682.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T04_27_15.231240
path:
- '**/details_harness|winogrande|5_2023-10-18T04-27-15.231240.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T04-27-15.231240.parquet'
- config_name: results
data_files:
- split: 2023_08_16T13_35_32.670682
path:
- results_2023-08-16T13:35:32.670682.parquet
- split: 2023_10_18T04_27_15.231240
path:
- results_2023-10-18T04-27-15.231240.parquet
- split: latest
path:
- results_2023-10-18T04-27-15.231240.parquet
---
# Dataset Card for Evaluation run of psmathur/orca_mini_v3_7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/orca_mini_v3_7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/orca_mini_v3_7b](https://huggingface.co/psmathur/orca_mini_v3_7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_v3_7b",
"harness_winogrande_5",
split="train")
```
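Judging from the configuration list above, each timestamped split name is simply the run's ISO timestamp with `-` and `:` replaced by `_`. A small helper illustrating that convention (purely illustrative, not part of any library):

```python
def timestamp_to_split(timestamp: str) -> str:
    """Convert a run timestamp (e.g. "2023-10-18T04:27:15.231240")
    into the split name used in this dataset's configurations
    (e.g. "2023_10_18T04_27_15.231240")."""
    return timestamp.replace("-", "_").replace(":", "_")
```

The `latest` split always aliases the most recent of these timestamped splits.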
## Latest results
These are the [latest results from run 2023-10-18T04:27:15.231240](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_v3_7b/blob/main/results_2023-10-18T04-27-15.231240.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in its results file and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.08043204697986577,
"em_stderr": 0.0027851341980506704,
"f1": 0.15059563758389252,
"f1_stderr": 0.0030534563383277672,
"acc": 0.4069827001752661,
"acc_stderr": 0.009686225873410097
},
"harness|drop|3": {
"em": 0.08043204697986577,
"em_stderr": 0.0027851341980506704,
"f1": 0.15059563758389252,
"f1_stderr": 0.0030534563383277672
},
"harness|gsm8k|5": {
"acc": 0.0712661106899166,
"acc_stderr": 0.007086462127954491
},
"harness|winogrande|5": {
"acc": 0.7426992896606156,
"acc_stderr": 0.012285989618865706
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Back-up/chung-khoan-demo-p3 | ---
dataset_info:
features:
- name: url
dtype: string
- name: title
dtype: string
- name: date
dtype: string
- name: view
struct:
- name: number_of_response
dtype: string
- name: number_of_view
dtype: string
- name: content
list:
- name: res
dtype: string
splits:
- name: train
num_bytes: 17013286
num_examples: 3274
download_size: 6188480
dataset_size: 17013286
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
balgot/stylegan3-annotated | ---
task_categories:
- image-to-text
- text-to-image
language:
- en
tags:
- stylegan3
- face-generation
size_categories:
- 10K<n<100K
---
# StyleGAN3 Annotated Images
This dataset consists of a `pandas` table and an attached `images.zip` file with these entries:
* seed (`numpy` seed used to generate random vectors)
* path (path to the generated image obtained after unzipping `images.zip`)
* vector (generated numpy "random" vector used to create StyleGAN3 images)
* text (caption of each image, generated using BLIP model: `Salesforce/blip-image-captioning-base`)
## Usage
To avoid loading the images into memory, we load them separately from the metadata table.
```python
from datasets import load_dataset

# Load the image archive and the metadata table as separate datasets
images = load_dataset("balgot/stylegan3-annotated", data_files=["*.zip"])
dataset = load_dataset("balgot/stylegan3-annotated", data_files=["*.csv"])
# TODO: convert "vector" column to numpy/torch
```
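The `TODO` above is left to the reader. As a hedged sketch, assuming the `vector` column stores a list-literal string of floats (an assumption; inspect the actual column to confirm the serialization), the conversion could look like:

```python
import ast

def parse_vector(raw: str) -> list[float]:
    """Parse a serialized vector back into a list of floats.

    Assumes the "vector" column stores a Python/JSON-style list literal,
    e.g. "[0.12, -1.5, 0.3]"; verify against the actual data first.
    """
    values = ast.literal_eval(raw)
    return [float(v) for v in values]
```

Each parsed list can then be wrapped in `numpy.asarray` or `torch.tensor` as needed.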
This dataset was created as part of a course project for FI:PA228 at Masaryk University. |
joey234/mmlu-high_school_european_history-rule-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 271443
num_examples: 165
download_size: 141556
dataset_size: 271443
---
# Dataset Card for "mmlu-high_school_european_history-rule-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Yorth/dalleTestData | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
- name: resolution
dtype: string
splits:
- name: train
num_bytes: 1065107.5689354276
num_examples: 4
download_size: 633186
dataset_size: 1065107.5689354276
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Harabojae/eajpark1_ | ---
license: openrail
---
|
open-llm-leaderboard/details_Weyaxi__a | ---
pretty_name: Evaluation run of Weyaxi/a
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/a](https://huggingface.co/Weyaxi/a) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__a\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T02:21:43.730137](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__a/blob/main/results_2024-02-10T02-21-43.730137.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5646393625024938,\n\
\ \"acc_stderr\": 0.0332095796161226,\n \"acc_norm\": 0.5738208212143998,\n\
\ \"acc_norm_stderr\": 0.03412746423842213,\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.016002651487361002,\n \"mc2\": 0.4454908751064698,\n\
\ \"mc2_stderr\": 0.014845534784458896\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5955631399317406,\n \"acc_stderr\": 0.014342036483436177,\n\
\ \"acc_norm\": 0.6348122866894198,\n \"acc_norm_stderr\": 0.014070265519268802\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6676956781517626,\n\
\ \"acc_stderr\": 0.004700767741735561,\n \"acc_norm\": 0.8648675562636925,\n\
\ \"acc_norm_stderr\": 0.003411663071651109\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5660377358490566,\n \"acc_stderr\": 0.030503292013342592,\n\
\ \"acc_norm\": 0.5660377358490566,\n \"acc_norm_stderr\": 0.030503292013342592\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.03981240543717861,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.03981240543717861\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\"\
: 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n\
\ \"acc_stderr\": 0.037940126746970296,\n \"acc_norm\": 0.5491329479768786,\n\
\ \"acc_norm_stderr\": 0.037940126746970296\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155243,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155243\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7129032258064516,\n\
\ \"acc_stderr\": 0.025736542745594525,\n \"acc_norm\": 0.7129032258064516,\n\
\ \"acc_norm_stderr\": 0.025736542745594525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.034139638059062345,\n\
\ \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.034139638059062345\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270285,\n \"\
acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270285\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5102564102564102,\n \"acc_stderr\": 0.025345672221942374,\n\
\ \"acc_norm\": 0.5102564102564102,\n \"acc_norm_stderr\": 0.025345672221942374\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507384,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507384\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.03210479051015776,\n \
\ \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.03210479051015776\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7761467889908257,\n\
\ \"acc_stderr\": 0.017871217767790236,\n \"acc_norm\": 0.7761467889908257,\n\
\ \"acc_norm_stderr\": 0.017871217767790236\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n\
\ \"acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640766,\n \"\
acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640766\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579654,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579654\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\
\ \"acc_stderr\": 0.03252113489929189,\n \"acc_norm\": 0.6233183856502242,\n\
\ \"acc_norm_stderr\": 0.03252113489929189\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n\
\ \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6319018404907976,\n \"acc_stderr\": 0.03789213935838396,\n\
\ \"acc_norm\": 0.6319018404907976,\n \"acc_norm_stderr\": 0.03789213935838396\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572924,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572924\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.024414947304543674,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.024414947304543674\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7662835249042146,\n\
\ \"acc_stderr\": 0.015133383278988836,\n \"acc_norm\": 0.7662835249042146,\n\
\ \"acc_norm_stderr\": 0.015133383278988836\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6040462427745664,\n \"acc_stderr\": 0.02632981334194624,\n\
\ \"acc_norm\": 0.6040462427745664,\n \"acc_norm_stderr\": 0.02632981334194624\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n\
\ \"acc_stderr\": 0.014288343803925293,\n \"acc_norm\": 0.24022346368715083,\n\
\ \"acc_norm_stderr\": 0.014288343803925293\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.027956046165424523,\n\
\ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.027956046165424523\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n\
\ \"acc_stderr\": 0.02698147804364803,\n \"acc_norm\": 0.6559485530546624,\n\
\ \"acc_norm_stderr\": 0.02698147804364803\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409825,\n\
\ \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409825\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255856,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255856\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42894393741851367,\n\
\ \"acc_stderr\": 0.012640625443067361,\n \"acc_norm\": 0.42894393741851367,\n\
\ \"acc_norm_stderr\": 0.012640625443067361\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.029896163033125478,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.029896163033125478\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5800653594771242,\n \"acc_stderr\": 0.019966811178256483,\n \
\ \"acc_norm\": 0.5800653594771242,\n \"acc_norm_stderr\": 0.019966811178256483\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242307,\n\
\ \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242307\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7860696517412935,\n\
\ \"acc_stderr\": 0.02899690969332891,\n \"acc_norm\": 0.7860696517412935,\n\
\ \"acc_norm_stderr\": 0.02899690969332891\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.038922121953330446,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.038922121953330446\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7426900584795322,\n \"acc_stderr\": 0.03352799844161865,\n\
\ \"acc_norm\": 0.7426900584795322,\n \"acc_norm_stderr\": 0.03352799844161865\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.016002651487361002,\n \"mc2\": 0.4454908751064698,\n\
\ \"mc2_stderr\": 0.014845534784458896\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.823993685872139,\n \"acc_stderr\": 0.010703090882320705\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/a
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|arc:challenge|25_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|gsm8k|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hellaswag|10_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T02-21-43.730137.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T02-21-43.730137.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- '**/details_harness|winogrande|5_2024-02-10T02-21-43.730137.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T02-21-43.730137.parquet'
- config_name: results
data_files:
- split: 2024_02_10T02_21_43.730137
path:
- results_2024-02-10T02-21-43.730137.parquet
- split: latest
path:
- results_2024-02-10T02-21-43.730137.parquet
---
# Dataset Card for Evaluation run of Weyaxi/a
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/a](https://huggingface.co/Weyaxi/a) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__a",
"harness_winogrande_5",
split="train")
```
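As described above, each timestamped split is named after the run that produced it (e.g. `2024_02_10T02_21_43.730137`). A minimal sketch of recovering the run time from such a split name, assuming this naming scheme (underscores standing in for the separators of the original ISO timestamp):

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    """Convert a timestamped split name like '2024_02_10T02_21_43.730137'
    back into a datetime object (assumes the naming scheme shown above)."""
    date_part, time_part = split_name.split("T")
    # Restore the ISO 8601 separators that the split name replaced with '_'.
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

print(split_to_datetime("2024_02_10T02_21_43.730137"))
# 2024-02-10 02:21:43.730137
```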
## Latest results
These are the [latest results from run 2024-02-10T02:21:43.730137](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__a/blob/main/results_2024-02-10T02-21-43.730137.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5646393625024938,
"acc_stderr": 0.0332095796161226,
"acc_norm": 0.5738208212143998,
"acc_norm_stderr": 0.03412746423842213,
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361002,
"mc2": 0.4454908751064698,
"mc2_stderr": 0.014845534784458896
},
"harness|arc:challenge|25": {
"acc": 0.5955631399317406,
"acc_stderr": 0.014342036483436177,
"acc_norm": 0.6348122866894198,
"acc_norm_stderr": 0.014070265519268802
},
"harness|hellaswag|10": {
"acc": 0.6676956781517626,
"acc_stderr": 0.004700767741735561,
"acc_norm": 0.8648675562636925,
"acc_norm_stderr": 0.003411663071651109
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5660377358490566,
"acc_stderr": 0.030503292013342592,
"acc_norm": 0.5660377358490566,
"acc_norm_stderr": 0.030503292013342592
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.03981240543717861,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.03981240543717861
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.037940126746970296,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.037940126746970296
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.04164188720169377,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.04164188720169377
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155243,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155243
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7129032258064516,
"acc_stderr": 0.025736542745594525,
"acc_norm": 0.7129032258064516,
"acc_norm_stderr": 0.025736542745594525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3793103448275862,
"acc_stderr": 0.034139638059062345,
"acc_norm": 0.3793103448275862,
"acc_norm_stderr": 0.034139638059062345
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5102564102564102,
"acc_stderr": 0.025345672221942374,
"acc_norm": 0.5102564102564102,
"acc_norm_stderr": 0.025345672221942374
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507384,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507384
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5756302521008403,
"acc_stderr": 0.03210479051015776,
"acc_norm": 0.5756302521008403,
"acc_norm_stderr": 0.03210479051015776
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7761467889908257,
"acc_stderr": 0.017871217767790236,
"acc_norm": 0.7761467889908257,
"acc_norm_stderr": 0.017871217767790236
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640766,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640766
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579654,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579654
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.03252113489929189,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.03252113489929189
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497751,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497751
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6319018404907976,
"acc_stderr": 0.03789213935838396,
"acc_norm": 0.6319018404907976,
"acc_norm_stderr": 0.03789213935838396
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.04498676320572924,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.04498676320572924
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543674,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543674
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7662835249042146,
"acc_stderr": 0.015133383278988836,
"acc_norm": 0.7662835249042146,
"acc_norm_stderr": 0.015133383278988836
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6040462427745664,
"acc_stderr": 0.02632981334194624,
"acc_norm": 0.6040462427745664,
"acc_norm_stderr": 0.02632981334194624
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.014288343803925293,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.014288343803925293
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.027956046165424523,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.027956046165424523
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6559485530546624,
"acc_stderr": 0.02698147804364803,
"acc_norm": 0.6559485530546624,
"acc_norm_stderr": 0.02698147804364803
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409825,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409825
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.02952591430255856,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.02952591430255856
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42894393741851367,
"acc_stderr": 0.012640625443067361,
"acc_norm": 0.42894393741851367,
"acc_norm_stderr": 0.012640625443067361
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.029896163033125478,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.029896163033125478
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5800653594771242,
"acc_stderr": 0.019966811178256483,
"acc_norm": 0.5800653594771242,
"acc_norm_stderr": 0.019966811178256483
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6653061224489796,
"acc_stderr": 0.030209235226242307,
"acc_norm": 0.6653061224489796,
"acc_norm_stderr": 0.030209235226242307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7860696517412935,
"acc_stderr": 0.02899690969332891,
"acc_norm": 0.7860696517412935,
"acc_norm_stderr": 0.02899690969332891
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.038922121953330446,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.038922121953330446
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7426900584795322,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.7426900584795322,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361002,
"mc2": 0.4454908751064698,
"mc2_stderr": 0.014845534784458896
},
"harness|winogrande|5": {
"acc": 0.823993685872139,
"acc_stderr": 0.010703090882320705
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
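The top-level `all` block averages the per-task metrics. As a minimal sketch of that aggregation, here is a macro-average over a few of the MMLU subtask accuracies listed above (values copied from the JSON; the subset is illustrative, not the full leaderboard average):

```python
# Macro-average a handful of the hendrycksTest (MMLU) subtask accuracies above.
scores = {
    "prehistory": 0.6728395061728395,
    "professional_accounting": 0.42907801418439717,
    "professional_law": 0.42894393741851367,
    "professional_medicine": 0.5882352941176471,
}
macro_avg = sum(scores.values()) / len(scores)
print(f"macro-average over {len(scores)} subtasks: {macro_avg:.4f}")
```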
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
sam-mosaic/iv4-chatml-4k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: source
dtype: string
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 2099655591.2305844
num_examples: 346151
- name: test
num_bytes: 315348071.4406665
num_examples: 51577
download_size: 295209643
dataset_size: 2415003662.671251
---
# Dataset Card for "iv4-chatml-4k"
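The split statistics in the YAML metadata above also give a rough per-example size. A quick sanity-check sketch (numbers copied from the metadata; the figures are approximate, since `num_bytes` is itself an estimate):

```python
# Approximate average bytes per example for each split, from the metadata above.
splits = {
    "train": (2099655591.2305844, 346151),
    "test": (315348071.4406665, 51577),
}
avg_bytes = {name: nb / ne for name, (nb, ne) in splits.items()}
for name, avg in avg_bytes.items():
    print(f"{name}: ~{avg:.0f} bytes/example")
```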
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MTruc/vit5_1 | ---
license: apache-2.0
---
|
eunbinni/ola_polyglot_1.3B_t2_data | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 91136318
num_examples: 22214
download_size: 47121283
dataset_size: 91136318
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ola_polyglot_1.3B_t2_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
seungchanlee/test-1 | ---
license: apache-2.0
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
KirbyShrine/wally_bagbean | ---
license: cc-by-nc-nd-4.0
---
|
CyberHarem/chicago_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of chicago/シカゴ/芝加哥 (Azur Lane)
This is the dataset of chicago/シカゴ/芝加哥 (Azur Lane), containing 21 images and their tags.
The core tags of this character are `breasts, drill_hair, blonde_hair, ahoge, blue_eyes, large_breasts, twin_drills, hair_between_eyes, long_hair, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 21 | 21.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chicago_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 21 | 14.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chicago_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 49 | 29.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chicago_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 21 | 19.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chicago_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 49 | 38.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chicago_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/chicago_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
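Assuming `item.meta['tags']` is a mapping from tag name to confidence (as the snippet above suggests), items can also be filtered locally by tag. A minimal sketch using a hypothetical `has_tags` helper (not part of waifuc) over tag dicts shaped like the crawled metadata:

```python
# Keep only tag dictionaries that contain every required tag.
def has_tags(tags, required):
    return all(t in tags for t in required)

# Tag dicts shaped like the crawled metadata (tags taken from the clusters below).
items = [
    {"1girl": 0.99, "smile": 0.95, "cleavage": 0.90},
    {"1girl": 0.98, "sitting": 0.80},
]
kept = [t for t in items if has_tags(t, {"1girl", "smile"})]
print(f"{len(kept)} of {len(items)} items match")
```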
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 21 |  |  |  |  |  | 1girl, smile, blush, cleavage, bare_shoulders, looking_at_viewer, navel, solo, black_choker, red_gloves, star_print, collarbone, midriff, elbow_gloves, criss-cross_halter, short_shorts, sitting |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | smile | blush | cleavage | bare_shoulders | looking_at_viewer | navel | solo | black_choker | red_gloves | star_print | collarbone | midriff | elbow_gloves | criss-cross_halter | short_shorts | sitting |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------|:-----------|:-----------------|:--------------------|:--------|:-------|:---------------|:-------------|:-------------|:-------------|:----------|:---------------|:---------------------|:---------------|:----------|
| 0 | 21 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2 | ---
pretty_name: Evaluation run of Locutusque/TinyMistral-248M-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Locutusque/TinyMistral-248M-v2](https://huggingface.co/Locutusque/TinyMistral-248M-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-07T20:59:32.750418](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2/blob/main/results_2024-01-07T20-59-32.750418.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23444137768381748,\n\
\ \"acc_stderr\": 0.030036991331001676,\n \"acc_norm\": 0.23411204926810086,\n\
\ \"acc_norm_stderr\": 0.030827397503827552,\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766365,\n \"mc2\": 0.49601663232017196,\n\
\ \"mc2_stderr\": 0.01564731250181349\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.18600682593856654,\n \"acc_stderr\": 0.011370940183266738,\n\
\ \"acc_norm\": 0.21245733788395904,\n \"acc_norm_stderr\": 0.01195348290658295\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.26180043815972914,\n\
\ \"acc_stderr\": 0.004387161203087972,\n \"acc_norm\": 0.26558454491137223,\n\
\ \"acc_norm_stderr\": 0.004407413723383408\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n\
\ \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n\
\ \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.20754716981132076,\n \"acc_stderr\": 0.024959918028911274,\n\
\ \"acc_norm\": 0.20754716981132076,\n \"acc_norm_stderr\": 0.024959918028911274\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\
\ \"acc_stderr\": 0.030631145539198816,\n \"acc_norm\": 0.2023121387283237,\n\
\ \"acc_norm_stderr\": 0.030631145539198816\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179963,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179963\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281336,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281336\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.21428571428571427,\n \"acc_stderr\": 0.021132859182754454,\n \"\
acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.021132859182754454\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.19032258064516128,\n\
\ \"acc_stderr\": 0.022331707611823085,\n \"acc_norm\": 0.19032258064516128,\n\
\ \"acc_norm_stderr\": 0.022331707611823085\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.16748768472906403,\n \"acc_stderr\": 0.026273086047535428,\n\
\ \"acc_norm\": 0.16748768472906403,\n \"acc_norm_stderr\": 0.026273086047535428\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603488,\n\
\ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603488\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.20202020202020202,\n \"acc_stderr\": 0.028606204289229872,\n \"\
acc_norm\": 0.20202020202020202,\n \"acc_norm_stderr\": 0.028606204289229872\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.18652849740932642,\n \"acc_stderr\": 0.028112091210117447,\n\
\ \"acc_norm\": 0.18652849740932642,\n \"acc_norm_stderr\": 0.028112091210117447\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.21794871794871795,\n \"acc_stderr\": 0.02093244577446318,\n\
\ \"acc_norm\": 0.21794871794871795,\n \"acc_norm_stderr\": 0.02093244577446318\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.21481481481481482,\n \"acc_stderr\": 0.025040443877000683,\n \
\ \"acc_norm\": 0.21481481481481482,\n \"acc_norm_stderr\": 0.025040443877000683\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868966,\n\
\ \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868966\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.2018348623853211,\n \"acc_stderr\": 0.017208579357787565,\n \"\
acc_norm\": 0.2018348623853211,\n \"acc_norm_stderr\": 0.017208579357787565\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1712962962962963,\n \"acc_stderr\": 0.025695341643824688,\n \"\
acc_norm\": 0.1712962962962963,\n \"acc_norm_stderr\": 0.025695341643824688\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.28270042194092826,\n \"acc_stderr\": 0.029312814153955917,\n\
\ \"acc_norm\": 0.28270042194092826,\n \"acc_norm_stderr\": 0.029312814153955917\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3094170403587444,\n\
\ \"acc_stderr\": 0.031024411740572206,\n \"acc_norm\": 0.3094170403587444,\n\
\ \"acc_norm_stderr\": 0.031024411740572206\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841043,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841043\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.29914529914529914,\n\
\ \"acc_stderr\": 0.029996951858349497,\n \"acc_norm\": 0.29914529914529914,\n\
\ \"acc_norm_stderr\": 0.029996951858349497\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23499361430395913,\n\
\ \"acc_stderr\": 0.015162024152278434,\n \"acc_norm\": 0.23499361430395913,\n\
\ \"acc_norm_stderr\": 0.015162024152278434\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2398843930635838,\n \"acc_stderr\": 0.02298959254312357,\n\
\ \"acc_norm\": 0.2398843930635838,\n \"acc_norm_stderr\": 0.02298959254312357\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.23202614379084968,\n \"acc_stderr\": 0.024170840879341016,\n\
\ \"acc_norm\": 0.23202614379084968,\n \"acc_norm_stderr\": 0.024170840879341016\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.20679012345679013,\n \"acc_stderr\": 0.02253500670594282,\n\
\ \"acc_norm\": 0.20679012345679013,\n \"acc_norm_stderr\": 0.02253500670594282\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25177304964539005,\n \"acc_stderr\": 0.0258921511567094,\n \
\ \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.0258921511567094\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2438070404172099,\n\
\ \"acc_stderr\": 0.010966507972178475,\n \"acc_norm\": 0.2438070404172099,\n\
\ \"acc_norm_stderr\": 0.010966507972178475\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.024398192986654924,\n\
\ \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.024398192986654924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25980392156862747,\n \"acc_stderr\": 0.017740899509177788,\n \
\ \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.017740899509177788\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.19183673469387755,\n \"acc_stderr\": 0.025206963154225395,\n\
\ \"acc_norm\": 0.19183673469387755,\n \"acc_norm_stderr\": 0.025206963154225395\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2891566265060241,\n\
\ \"acc_stderr\": 0.03529486801511115,\n \"acc_norm\": 0.2891566265060241,\n\
\ \"acc_norm_stderr\": 0.03529486801511115\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.30994152046783624,\n \"acc_stderr\": 0.03546976959393163,\n\
\ \"acc_norm\": 0.30994152046783624,\n \"acc_norm_stderr\": 0.03546976959393163\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766365,\n \"mc2\": 0.49601663232017196,\n\
\ \"mc2_stderr\": 0.01564731250181349\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5185477505919495,\n \"acc_stderr\": 0.014042813708888378\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Locutusque/TinyMistral-248M-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|arc:challenge|25_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|gsm8k|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hellaswag|10_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-07T20-59-32.750418.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-07T20-59-32.750418.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- '**/details_harness|winogrande|5_2024-01-07T20-59-32.750418.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-07T20-59-32.750418.parquet'
- config_name: results
data_files:
- split: 2024_01_07T20_59_32.750418
path:
- results_2024-01-07T20-59-32.750418.parquet
- split: latest
path:
- results_2024-01-07T20-59-32.750418.parquet
---
# Dataset Card for Evaluation run of Locutusque/TinyMistral-248M-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Locutusque/TinyMistral-248M-v2](https://huggingface.co/Locutusque/TinyMistral-248M-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2",
"harness_winogrande_5",
split="train")
```
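To load an older run instead of the latest one, pass the timestamped split name. Judging from the file names above, a split name appears to be the run timestamp with `-` and `:` replaced by `_`. A minimal helper illustrating that convention (an assumption inferred from this card, not part of the leaderboard tooling):

```python
def timestamp_to_split(timestamp: str) -> str:
    """Derive a per-run split name from a run timestamp.

    Assumption: split names are the ISO timestamp with '-' and ':'
    replaced by '_', e.g. "2024-01-07T20:59:32.750418" becomes
    "2024_01_07T20_59_32.750418" (matching the splits listed above).
    """
    return timestamp.replace("-", "_").replace(":", "_")


print(timestamp_to_split("2024-01-07T20:59:32.750418"))
# → 2024_01_07T20_59_32.750418
```

The resulting string can then be used as the `split` argument to `load_dataset` in place of `"train"`.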
## Latest results
These are the [latest results from run 2024-01-07T20:59:32.750418](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2/blob/main/results_2024-01-07T20-59-32.750418.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23444137768381748,
"acc_stderr": 0.030036991331001676,
"acc_norm": 0.23411204926810086,
"acc_norm_stderr": 0.030827397503827552,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766365,
"mc2": 0.49601663232017196,
"mc2_stderr": 0.01564731250181349
},
"harness|arc:challenge|25": {
"acc": 0.18600682593856654,
"acc_stderr": 0.011370940183266738,
"acc_norm": 0.21245733788395904,
"acc_norm_stderr": 0.01195348290658295
},
"harness|hellaswag|10": {
"acc": 0.26180043815972914,
"acc_stderr": 0.004387161203087972,
"acc_norm": 0.26558454491137223,
"acc_norm_stderr": 0.004407413723383408
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.20754716981132076,
"acc_stderr": 0.024959918028911274,
"acc_norm": 0.20754716981132076,
"acc_norm_stderr": 0.024959918028911274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.030631145539198816,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.030631145539198816
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179963,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179963
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281336,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281336
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.021132859182754454,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.021132859182754454
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.19032258064516128,
"acc_stderr": 0.022331707611823085,
"acc_norm": 0.19032258064516128,
"acc_norm_stderr": 0.022331707611823085
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.16748768472906403,
"acc_stderr": 0.026273086047535428,
"acc_norm": 0.16748768472906403,
"acc_norm_stderr": 0.026273086047535428
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.03287666758603488,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.03287666758603488
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20202020202020202,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.20202020202020202,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.18652849740932642,
"acc_stderr": 0.028112091210117447,
"acc_norm": 0.18652849740932642,
"acc_norm_stderr": 0.028112091210117447
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.21794871794871795,
"acc_stderr": 0.02093244577446318,
"acc_norm": 0.21794871794871795,
"acc_norm_stderr": 0.02093244577446318
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.21481481481481482,
"acc_stderr": 0.025040443877000683,
"acc_norm": 0.21481481481481482,
"acc_norm_stderr": 0.025040443877000683
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868966,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868966
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.2018348623853211,
"acc_stderr": 0.017208579357787565,
"acc_norm": 0.2018348623853211,
"acc_norm_stderr": 0.017208579357787565
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1712962962962963,
"acc_stderr": 0.025695341643824688,
"acc_norm": 0.1712962962962963,
"acc_norm_stderr": 0.025695341643824688
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.28270042194092826,
"acc_stderr": 0.029312814153955917,
"acc_norm": 0.28270042194092826,
"acc_norm_stderr": 0.029312814153955917
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3094170403587444,
"acc_stderr": 0.031024411740572206,
"acc_norm": 0.3094170403587444,
"acc_norm_stderr": 0.031024411740572206
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841043,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841043
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.29914529914529914,
"acc_stderr": 0.029996951858349497,
"acc_norm": 0.29914529914529914,
"acc_norm_stderr": 0.029996951858349497
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23499361430395913,
"acc_stderr": 0.015162024152278434,
"acc_norm": 0.23499361430395913,
"acc_norm_stderr": 0.015162024152278434
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2398843930635838,
"acc_stderr": 0.02298959254312357,
"acc_norm": 0.2398843930635838,
"acc_norm_stderr": 0.02298959254312357
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23202614379084968,
"acc_stderr": 0.024170840879341016,
"acc_norm": 0.23202614379084968,
"acc_norm_stderr": 0.024170840879341016
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.20679012345679013,
"acc_stderr": 0.02253500670594282,
"acc_norm": 0.20679012345679013,
"acc_norm_stderr": 0.02253500670594282
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.0258921511567094,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.0258921511567094
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2438070404172099,
"acc_stderr": 0.010966507972178475,
"acc_norm": 0.2438070404172099,
"acc_norm_stderr": 0.010966507972178475
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.024398192986654924,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.024398192986654924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.017740899509177788,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.017740899509177788
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19183673469387755,
"acc_stderr": 0.025206963154225395,
"acc_norm": 0.19183673469387755,
"acc_norm_stderr": 0.025206963154225395
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2891566265060241,
"acc_stderr": 0.03529486801511115,
"acc_norm": 0.2891566265060241,
"acc_norm_stderr": 0.03529486801511115
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30994152046783624,
"acc_stderr": 0.03546976959393163,
"acc_norm": 0.30994152046783624,
"acc_norm_stderr": 0.03546976959393163
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766365,
"mc2": 0.49601663232017196,
"mc2_stderr": 0.01564731250181349
},
"harness|winogrande|5": {
"acc": 0.5185477505919495,
"acc_stderr": 0.014042813708888378
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
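The aggregated MMLU score shown on the leaderboard is the mean of the per-subtask accuracies. A minimal stdlib-only sketch of that aggregation, using three `acc_norm` values copied from the results above (the real average runs over all 57 `hendrycksTest` subtasks, not just these):

```python
# acc_norm values copied from the results block above (illustrative subset).
SCORES = {
    "high_school_chemistry": 0.16748768472906403,
    "high_school_computer_science": 0.26,
    "high_school_european_history": 0.23030303030303031,
}

# The leaderboard averages per-subtask accuracy with equal weight per subtask.
mean_acc_norm = sum(SCORES.values()) / len(SCORES)
print(f"mean acc_norm over {len(SCORES)} subtasks: {mean_acc_norm:.4f}")
# → mean acc_norm over 3 subtasks: 0.2193
```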
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
0x7o/fialka-v3-data | ---
dataset_info:
features:
- name: texts
dtype: string
splits:
- name: train
num_bytes: 225223909.0
num_examples: 130958
download_size: 105289541
dataset_size: 225223909.0
---
# Dataset Card for "fialka-v3-data"
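The YAML metadata above is the card's only content; a quick stdlib-only sanity check on those split figures (numbers copied from the `dataset_info` block — the commented-out `load_dataset` call is the usual loading pattern and is not executed here):

```python
# Split figures copied from the dataset_info block above.
num_bytes = 225223909
num_examples = 130958

# Rough average size of one `texts` string, in bytes.
avg_bytes_per_example = num_bytes / num_examples
print(f"~{avg_bytes_per_example:.0f} bytes per example")  # → ~1720 bytes per example

# Typical loading pattern (requires the `datasets` library and network access):
# from datasets import load_dataset
# ds = load_dataset("0x7o/fialka-v3-data", split="train")
```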
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
weijie210/gsm8k_decomposed | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: sub_questions
sequence: string
- name: sub_answers
sequence: string
splits:
- name: train
num_bytes: 7957698.071086808
num_examples: 7115
- name: test
num_bytes: 223687.92891319207
num_examples: 200
download_size: 4453334
dataset_size: 8181386.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
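The card declares its schema above but has no body. As a hedged illustration of that schema (field names taken from the `dataset_info` block; the sample values are invented), each record pairs a GSM8K question with aligned lists of decomposed sub-questions and sub-answers:

```python
# Hypothetical record matching the declared features (the values are invented).
record = {
    "question": "Ann has 3 apples and buys 2 more. How many apples does she have now?",
    "answer": "5",
    "sub_questions": ["How many apples does Ann start with?", "How many does she buy?"],
    "sub_answers": ["3", "2"],
}

# The two sequence fields are aligned one-to-one.
pairs = list(zip(record["sub_questions"], record["sub_answers"]))
for sub_q, sub_a in pairs:
    print(f"{sub_q} -> {sub_a}")
```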
|
distilabel-internal-testing/deita-colab | ---
size_categories: n<1K
config_names:
- deita_filtering
tags:
- synthetic
- distilabel
- rlaif
---
<p align="left">
<a href="https://github.com/argilla-io/distilabel">
<img src="https://raw.githubusercontent.com/argilla-io/distilabel/main/docs/assets/distilabel-badge-light.png" alt="Built with Distilabel" width="200" height="32"/>
</a>
</p>
# Dataset Card for deita-colab
This dataset has been created with [Distilabel](https://distilabel.argilla.io/).
## Dataset Summary
This dataset contains a `pipeline.yaml` that can be used to reproduce the pipeline that generated it, using the `distilabel` CLI:
```console
distilabel pipeline run --config "https://huggingface.co/datasets/distilabel-internal-testing/deita-colab/raw/main/pipeline.yaml"
```
or explore the configuration:
```console
distilabel pipeline info --config "https://huggingface.co/datasets/distilabel-internal-testing/deita-colab/raw/main/pipeline.yaml"
```
## Dataset structure
The examples have the following structure per configuration:
<details><summary> Configuration: deita_filtering </summary><hr>
```json
{
"answer": "To enhance the existing Python script, we can follow the steps mentioned below:\n\n1. Identify and improve performance bottlenecks:\n- Use profiling tools like cProfile to identify the parts of the script that are taking the most time to execute.\n- Once identified, consider optimizing the code by using more efficient data structures, algorithms, or techniques.\n- Utilize libraries like NumPy for numerical computations or Pandas for data manipulation, which can significantly improve performance.\n\n2. Incorporate comprehensive exception handling:\n- Add try-except blocks to handle exceptions gracefully and provide meaningful error messages to the user.\n- Consider using specific exception handling to catch and handle different types of errors appropriately.\n- Use logging to capture and report errors for debugging purposes.\n\n3. Refine algorithmic complexities:\n- Analyze the existing algorithms used in the script and identify opportunities for optimization.\n- Consider using more efficient algorithms or data structures to reduce time complexity and improve overall performance.\n- Break down complex algorithms into smaller, more manageable functions for better readability and maintainability.\n\nBy following these steps, we can enhance the existing Python script by improving its performance, incorporating comprehensive exception handling, and refining algorithmic complexities.",
"completion": "The original code counts the number of lowercase vowels in a string, but the count should probably be case-insensitive. Here is a case-insensitive version.\r\n\r\nfunction countVowels(str) {\r\n let vowels = [\u0027a\u0027, \u0027e\u0027, \u0027i\u0027, \u0027o\u0027, \u0027u\u0027];\r\n let count = 0;\r\n \r\n for (let i = 0; i \u003c str.length; i++) {\r\n if (vowels.includes(str[i].toLowerCase())) {\r\n count++;\r\n }\r\n }\r\n \r\n return count;\r\n}",
"conversation": [
{
"content": "Enhance the existing Python script by identifying and improving performance bottlenecks, incorporating comprehensive exception handling, and refining algorithmic complexities.",
"role": "user"
},
{
"content": "To enhance the existing Python script, we can follow the steps mentioned below:\n\n1. Identify and improve performance bottlenecks:\n- Use profiling tools like cProfile to identify the parts of the script that are taking the most time to execute.\n- Once identified, consider optimizing the code by using more efficient data structures, algorithms, or techniques.\n- Utilize libraries like NumPy for numerical computations or Pandas for data manipulation, which can significantly improve performance.\n\n2. Incorporate comprehensive exception handling:\n- Add try-except blocks to handle exceptions gracefully and provide meaningful error messages to the user.\n- Consider using specific exception handling to catch and handle different types of errors appropriately.\n- Use logging to capture and report errors for debugging purposes.\n\n3. Refine algorithmic complexities:\n- Analyze the existing algorithms used in the script and identify opportunities for optimization.\n- Consider using more efficient algorithms or data structures to reduce time complexity and improve overall performance.\n- Break down complex algorithms into smaller, more manageable functions for better readability and maintainability.\n\nBy following these steps, we can enhance the existing Python script by improving its performance, incorporating comprehensive exception handling, and refining algorithmic complexities.",
"role": "assistant"
}
],
"deita_score": 49.0,
"deita_score_computed_with": [
"evol_instruction_score",
"evol_response_score"
],
"embedding": [
1.9404296875,
-0.5810546875,
-0.353271484375,
-0.583984375,
-0.70166015625,
-1.3662109375,
1.4228515625,
-1.9580078125,
0.82373046875,
0.85888671875,
2.052734375,
2.3515625,
0.07666015625,
2.123046875,
-0.8603515625,
-1.31640625,
-0.60302734375,
2.017578125,
2.4921875,
-0.007022857666015625,
0.152099609375,
-0.297119140625,
-3.37109375,
-1.5126953125,
1.369140625,
-0.92041015625,
-2.818359375,
2.052734375,
-1.75390625,
-0.92041015625,
0.1607666015625,
-1.919921875,
-0.11322021484375,
-1.7890625,
0.986328125,
0.61181640625,
-2.34375,
1.478515625,
-2.24609375,
-3.359375,
1.87109375,
-0.498291015625,
0.8603515625,
-2.833984375,
-2.17578125,
0.5419921875,
1.8916015625,
-0.705078125,
1.1044921875,
1.4619140625,
-2.486328125,
-0.255126953125,
-2.259765625,
4.890625,
3.80859375,
0.56982421875,
0.96728515625,
1.306640625,
-0.086181640625,
-1.9580078125,
-6.6640625,
-1.3369140625,
1.4453125,
-0.751953125,
-0.9580078125,
4.4609375,
-0.16796875,
2.0625,
2.072265625,
1.1123046875,
0.302734375,
-1.767578125,
-0.49072265625,
1.591796875,
1.1728515625,
0.1300048828125,
-2.046875,
-2.732421875,
-1.0087890625,
-2.744140625,
1.705078125,
3.912109375,
-0.00830078125,
1.8046875,
-1.775390625,
0.91650390625,
-3.361328125,
0.748046875,
2.302734375,
-0.05987548828125,
-0.1666259765625,
-1.0341796875,
-0.364013671875,
1.36328125,
1.3349609375,
-0.2666015625,
0.916015625,
3.5234375,
-5.11328125,
-1.08984375,
0.1790771484375,
-2.76171875,
1.9931640625,
-1.0361328125,
0.35107421875,
-1.0498046875,
-1.986328125,
0.4267578125,
0.1044921875,
-2.044921875,
-1.0673828125,
-2.322265625,
0.7314453125,
-0.38037109375,
-0.110107421875,
1.5498046875,
-0.63671875,
0.72265625,
1.59375,
1.08984375,
1.9013671875,
0.015350341796875,
-1.8447265625,
1.8154296875,
0.3994140625,
0.64599609375,
-2.125,
1.142578125,
0.237548828125,
-1.84375,
-4.52734375,
1.12109375,
-2.203125,
-1.4931640625,
2.357421875,
-0.3798828125,
3.2109375,
-2.5703125,
0.0274658203125,
-1.060546875,
0.74853515625,
-2.099609375,
0.313232421875,
-0.921875,
2.08203125,
3.4765625,
-0.7041015625,
2.708984375,
-1.60546875,
0.179931640625,
3.9375,
0.7529296875,
0.1767578125,
1.3212890625,
2.345703125,
1.548828125,
-1.9990234375,
0.8681640625,
-9.9921875,
2.5,
0.1357421875,
-0.548828125,
-0.63037109375,
0.12255859375,
2.45703125,
-0.1949462890625,
-1.884765625,
-1.9921875,
2.1484375,
3.19921875,
1.681640625,
-0.0562744140625,
0.4736328125,
1.5673828125,
-2.37109375,
-2.5078125,
1.185546875,
-0.96044921875,
-2.0,
-2.279296875,
-2.779296875,
-1.0634765625,
1.65234375,
1.2685546875,
-1.3759765625,
3.06640625,
-0.400390625,
-0.70947265625,
-0.505859375,
1.904296875,
1.3193359375,
1.63671875,
-2.59765625,
-0.56787109375,
-1.28515625,
0.2398681640625,
1.884765625,
-2.421875,
1.8037109375,
2.580078125,
-1.751953125,
1.55859375,
-0.92626953125,
2.884765625,
1.626953125,
-1.8837890625,
0.75537109375,
-0.00331878662109375,
0.0017271041870117188,
-0.375732421875,
1.0380859375,
1.5361328125,
1.5107421875,
-0.75830078125,
2.3515625,
1.04296875,
-1.15625,
-3.265625,
-0.568359375,
-2.404296875,
1.224609375,
-1.9384765625,
0.68994140625,
0.71435546875,
1.2421875,
1.3681640625,
-0.27001953125,
1.67578125,
0.438720703125,
0.57080078125,
0.48779296875,
1.0263671875,
0.031280517578125,
1.064453125,
-11.8359375,
1.0634765625,
-3.607421875,
-0.10711669921875,
0.46533203125,
0.55078125,
1.994140625,
-0.260009765625,
-0.7255859375,
4.54296875,
-0.7626953125,
0.62255859375,
0.6455078125,
-0.49951171875,
0.65771484375,
1.17578125,
-0.181396484375,
-1.177734375,
1.9482421875,
-0.765625,
0.95556640625,
0.364013671875,
1.2607421875,
2.734375,
-1.3125,
-0.556640625,
-1.8720703125,
-0.2069091796875,
1.1025390625,
3.17578125,
-2.6875,
-0.00762176513671875,
0.09521484375,
0.6240234375,
1.0,
1.8505859375,
0.11871337890625,
4.671875,
-0.6201171875,
0.6650390625,
-0.65234375,
-2.697265625,
-1.84375,
2.861328125,
1.4072265625,
1.2685546875,
0.1640625,
1.43359375,
-9.046875,
-0.28759765625,
0.89892578125,
0.9541015625,
1.9892578125,
0.8876953125,
-0.42333984375,
-0.63916015625,
0.220947265625,
2.28125,
-0.401611328125,
2.578125,
0.63232421875,
0.0755615234375,
2.375,
0.76806640625,
-0.363037109375,
-1.751953125,
-2.888671875,
-0.487060546875,
0.07366943359375,
-0.677734375,
0.65185546875,
0.54150390625,
0.59619140625,
-1.3330078125,
0.63916015625,
-0.1605224609375,
0.320068359375,
0.495361328125,
-0.132568359375,
-1.30859375,
1.740234375,
0.513671875,
1.818359375,
3.294921875,
0.125732421875,
-2.359375,
-1.048828125,
-1.5107421875,
1.29296875,
0.78662109375,
2.04296875,
1.1669921875,
1.9013671875,
-0.348876953125,
-0.4560546875,
-1.8046875,
-0.9716796875,
2.517578125,
2.337890625,
1.064453125,
2.009765625,
-2.880859375,
0.08856201171875,
0.16162109375,
0.10711669921875,
1.4033203125,
-1.3173828125,
3.494140625,
0.650390625,
1.7353515625,
1.171875,
1.21875,
0.98291015625,
-0.80224609375,
-1.7861328125,
-3.4921875,
0.74658203125,
-1.0576171875,
-0.355224609375,
-2.712890625,
-0.7421875,
-1.99609375,
1.7255859375,
-4.0234375,
0.237548828125,
-0.2283935546875,
1.615234375,
-2.009765625,
-1.5869140625,
0.114013671875,
-1.0478515625,
0.81591796875,
0.74072265625,
-0.0081634521484375,
-0.46484375,
0.76220703125,
0.38720703125,
-1.232421875,
0.5888671875,
1.7890625,
5.71875,
1.5625,
1.294921875,
-0.89501953125,
0.04705810546875,
-1.2216796875,
-1.4560546875,
-1.51953125,
2.0859375,
-0.3291015625,
-1.109375,
-2.1640625,
3.96484375,
7.5078125,
0.69384765625,
1.509765625,
1.193359375,
2.318359375,
-1.1083984375,
1.0009765625,
1.404296875,
0.330810546875,
-1.6201171875,
-1.0625,
0.471923828125,
1.28515625,
0.1964111328125,
2.033203125,
1.5029296875,
1.7919921875,
-0.89697265625,
0.92138671875,
-0.098876953125,
-2.6640625,
3.39453125,
0.005352020263671875,
1.8779296875,
0.2276611328125,
-0.67919921875,
0.79541015625,
-1.43359375,
-2.298828125,
0.75927734375,
2.16015625,
1.0419921875,
-1.505859375,
0.95361328125,
-1.4306640625,
1.2724609375,
-0.814453125,
-1.9443359375,
0.88720703125,
-0.464111328125,
0.18798828125,
1.84765625,
-1.2978515625,
-1.416015625,
1.0859375,
-2.10546875,
0.54345703125,
1.7197265625,
-0.385986328125,
1.1376953125,
-0.08123779296875,
-3.708984375,
1.326171875,
-0.41845703125,
2.962890625,
3.392578125,
-0.98193359375,
2.6796875,
-1.2646484375,
-1.6845703125,
-0.37353515625,
-5.05859375,
0.448974609375,
0.55859375,
1.998046875,
-0.418212890625,
-3.123046875,
-1.501953125,
-1.8623046875,
-0.0240325927734375,
3.328125,
-1.5205078125,
0.91015625,
2.263671875,
0.161376953125,
1.52734375,
1.275390625,
2.279296875,
-0.282958984375,
-1.59765625,
-1.4521484375,
-2.244140625,
0.84716796875,
1.515625,
-0.465087890625,
0.1060791015625,
-1.22265625,
-2.01953125,
1.3017578125,
-1.59765625,
2.798828125,
-0.5576171875,
1.3603515625,
1.3974609375,
0.7099609375,
1.892578125,
2.158203125,
1.26171875,
-0.0504150390625,
0.5283203125,
3.005859375,
0.57080078125,
-0.97509765625,
0.7255859375,
1.6181640625,
2.333984375,
1.7646484375,
4.25390625,
-0.88623046875,
0.87548828125,
1.5625,
1.1650390625,
1.732421875,
1.619140625,
-0.162841796875,
-0.60546875,
-2.0078125,
-1.966796875,
0.5322265625,
-1.5283203125,
1.6298828125,
0.5849609375,
1.8037109375,
3.072265625,
1.7177734375,
-1.37890625,
1.873046875,
0.033660888671875,
-4.8984375,
-0.94775390625,
-2.08984375,
1.3134765625,
0.1512451171875,
2.84375,
2.7734375,
-2.76953125,
-0.76611328125,
1.1064453125,
-2.58203125,
-2.1796875,
-0.87060546875,
2.341796875,
-2.833984375,
-3.2734375,
0.9384765625,
1.65625,
-2.025390625,
-0.2041015625,
0.03497314453125,
-3.1171875,
-3.984375,
-0.88232421875,
2.25,
-2.65234375,
2.984375,
0.131103515625,
2.48046875,
1.943359375,
-1.53125,
1.9013671875,
1.72265625,
0.6123046875,
-0.0740966796875,
-0.1707763671875,
2.806640625,
-0.131591796875,
1.9296875,
1.869140625,
-2.916015625,
-1.44921875,
-2.703125,
-1.7509765625,
2.994140625,
-0.435302734375,
0.7001953125,
-0.6962890625,
-0.94189453125,
2.044921875,
-2.25390625,
0.95068359375,
2.21875,
-3.0859375,
-0.034027099609375,
2.09765625,
-1.271484375,
-4.2734375,
1.595703125,
2.19140625,
3.220703125,
-0.95458984375,
-2.087890625,
3.330078125,
-0.779296875,
-3.833984375,
-2.546875,
-1.689453125,
-0.5791015625,
0.83935546875,
2.361328125,
-0.2880859375,
-0.074462890625,
-2.4140625,
0.05438232421875,
0.34375,
-0.77734375,
0.1845703125,
0.2724609375,
-1.3505859375,
1.3388671875,
-0.25048828125,
1.361328125,
0.346923828125,
-1.73828125,
-0.82763671875,
-1.6982421875,
-0.317138671875,
2.28515625,
-2.6328125,
0.669921875,
-2.08203125,
-0.403076171875,
0.319091796875,
-1.91796875,
-2.455078125,
1.7578125,
1.416015625,
-0.267578125,
-0.1756591796875,
-0.76708984375,
1.8125,
3.283203125,
-2.12109375,
1.41796875,
0.182861328125,
-4.98046875,
2.98828125,
-3.91015625,
0.765625,
2.599609375,
-0.471435546875,
-0.578125,
-0.64208984375,
3.041015625,
1.5615234375,
1.9423828125,
-0.5673828125,
3.005859375,
-0.0701904296875,
-1.47265625,
-0.84423828125,
-3.443359375,
-0.84130859375,
2.375,
2.685546875,
-0.93896484375,
3.94140625,
1.2626953125,
-1.03125,
-1.0556640625,
0.705078125,
0.12164306640625,
-0.412353515625,
3.107421875,
-2.017578125,
-2.08203125,
0.439697265625,
-0.023529052734375,
1.177734375,
-0.85302734375,
0.125732421875,
0.1329345703125,
-1.70703125,
-0.6474609375,
2.00390625,
0.67138671875,
0.57763671875,
2.3203125,
-3.65625,
-0.2359619140625,
-0.26611328125,
1.408203125,
1.4306640625,
-0.323974609375,
-0.1478271484375,
0.06390380859375,
-1.673828125,
2.4296875,
4.8203125,
-0.057952880859375,
-0.301513671875,
0.86279296875,
2.283203125,
0.047393798828125,
-0.30908203125,
0.30078125,
-0.277099609375,
-4.828125,
1.263671875,
4.640625,
-2.900390625,
-2.697265625,
3.095703125,
2.71875,
-2.724609375,
-2.185546875,
0.232421875,
-3.45703125,
1.7724609375,
2.201171875,
-1.37109375,
0.009796142578125,
-0.1383056640625,
-0.330810546875,
-0.763671875,
1.794921875,
1.9560546875,
-1.482421875,
-1.08203125,
1.2431640625,
-2.140625,
-0.9580078125,
0.61083984375,
-0.54248046875,
-1.1162109375,
0.69921875,
-0.544921875,
1.2451171875,
2.326171875,
1.876953125,
0.5654296875,
-4.55078125,
0.178955078125,
-2.205078125,
-0.1549072265625,
-2.119140625,
-2.677734375,
-1.236328125,
0.03387451171875,
0.591796875,
0.853515625,
3.673828125,
-10.34375,
-3.9921875,
-0.98974609375,
0.7392578125,
4.375,
-0.60400390625,
-0.19189453125,
-1.0654296875,
3.3828125,
-1.185546875,
0.290771484375,
0.775390625,
3.39453125,
-1.916015625,
-1.796875,
1.041015625,
1.267578125,
1.236328125,
-0.5302734375,
1.701171875,
-2.529296875,
-0.056976318359375,
2.751953125,
-0.94677734375,
-0.1851806640625,
-0.199462890625,
0.0260009765625,
0.10797119140625,
1.2783203125,
-2.16015625,
-1.068359375,
1.9453125,
0.006610870361328125,
-4.68359375,
-7.25,
0.330078125,
3.537109375,
-0.25927734375,
-0.7236328125,
-0.1639404296875,
0.6796875,
-1.0380859375,
-0.54296875,
-0.5615234375,
3.072265625,
-0.83544921875,
0.179931640625,
-1.6396484375,
-3.611328125,
-1.740234375,
0.33984375,
3.615234375,
1.599609375,
-3.068359375,
-1.08984375,
-0.4404296875,
0.6103515625,
1.2353515625,
2.15234375,
-1.4052734375,
0.405029296875,
-2.205078125,
1.19921875,
0.454833984375,
-0.50439453125,
1.072265625,
3.134765625,
-1.39453125,
-2.521484375,
-1.8056640625,
0.984375,
-1.3740234375,
0.2344970703125,
2.869140625,
1.029296875,
0.0496826171875,
-1.1064453125,
2.296875,
1.6787109375,
2.673828125,
2.060546875,
0.479248046875,
2.296875,
0.44677734375,
3.974609375,
-2.1953125,
2.12890625,
-2.453125,
-0.78076171875,
0.60302734375,
-2.12890625,
-2.80859375,
-0.69580078125,
1.3525390625,
-1.05859375,
0.009796142578125,
-0.75830078125,
-1.08984375,
-1.80859375,
2.201171875,
-0.296630859375,
2.12890625,
1.8525390625,
-2.708984375,
-1.85546875,
0.66845703125,
1.0966796875,
-2.380859375,
0.9169921875,
1.14453125,
1.7939453125,
2.830078125,
-1.73828125,
-0.60205078125,
-1.4404296875,
-3.408203125,
1.4150390625,
2.119140625,
-1.77734375,
-0.7890625,
0.9931640625,
-1.0390625,
0.1417236328125,
-2.55859375,
-1.2744140625,
1.3212890625,
0.56689453125,
1.0302734375,
2.060546875,
1.140625,
0.401123046875,
-0.25,
-1.0185546875,
-1.6513671875,
-0.552734375,
0.44189453125,
-0.5927734375,
-2.294921875,
2.810546875,
1.177734375,
0.2493896484375,
0.466796875,
1.2744140625,
1.5439453125,
-0.333984375,
-4.67578125,
-1.3916015625,
0.07452392578125,
-0.2001953125,
1.04296875,
0.2374267578125,
0.4560546875,
1.0771484375,
-3.578125,
-2.240234375,
0.74853515625,
-0.54296875,
-0.9853515625,
-0.53515625,
-0.6494140625,
-0.93896484375,
0.8134765625,
-1.107421875,
-0.385986328125,
0.6572265625,
0.498291015625,
0.84033203125,
2.072265625,
2.732421875,
0.427001953125,
0.2213134765625,
0.5673828125,
-0.82080078125,
-0.0484619140625,
-0.2142333984375,
-3.203125,
-1.1943359375,
0.97412109375,
0.810546875,
0.22314453125,
-1.80078125,
2.224609375,
0.55029296875,
0.1343994140625,
-3.337890625,
0.354736328125,
1.40625,
-3.111328125,
0.336669921875,
-0.323486328125,
0.05572509765625,
2.615234375,
-0.73046875,
2.416015625,
-0.1693115234375,
-1.8388671875,
-3.2109375,
0.84912109375,
-1.47265625,
-1.390625,
-1.517578125,
-3.90234375,
-3.6953125,
-0.40283203125,
3.55078125,
0.677734375,
-0.109619140625,
-0.1444091796875,
3.837890625,
0.6689453125,
-0.1016845703125,
0.3544921875,
2.544921875,
0.288818359375,
1.5986328125,
0.9755859375,
1.8583984375,
-0.049102783203125,
1.4990234375,
-0.06646728515625,
-2.03515625,
-1.185546875,
0.66943359375,
-2.068359375,
0.48046875,
-4.95703125,
-0.86767578125,
-1.794921875,
0.81640625,
-1.5068359375,
-0.80615234375,
-1.072265625,
-0.197509765625,
3.2109375,
-1.388671875,
-4.5078125,
2.40625,
0.489501953125,
1.9375,
-2.974609375,
1.62109375,
1.1025390625,
-0.66943359375,
-1.7255859375,
-0.267822265625,
3.75390625,
-0.1768798828125,
0.8203125,
-0.417236328125,
1.861328125,
-2.685546875,
2.0546875,
-0.9208984375,
-2.748046875,
0.06689453125,
-2.052734375,
1.8154296875,
1.751953125,
0.342041015625,
0.591796875,
-0.89794921875,
0.60986328125,
0.26220703125,
0.2998046875,
-0.37548828125,
-1.3798828125,
1.658203125,
1.6494140625,
1.3330078125,
-3.5,
0.153076171875,
-2.212890625,
1.544921875,
2.74609375,
0.1263427734375,
-0.26513671875,
0.366943359375,
-2.67578125,
-0.1578369140625,
-0.8017578125,
0.47119140625,
-0.923828125,
-1.2197265625,
0.56201171875,
3.615234375,
3.125,
-0.46240234375,
-0.7158203125,
-2.8359375,
-2.501953125,
-0.5478515625,
2.30859375,
-0.51416015625,
1.56640625,
1.923828125,
2.890625,
-0.2342529296875,
-2.716796875,
0.211669921875,
-0.77587890625,
1.1455078125,
-0.01788330078125,
-1.51171875,
-0.248046875,
-1.4072265625,
1.03125,
-3.234375,
-0.24658203125,
-0.10333251953125,
2.734375,
-0.9814453125,
0.77978515625,
-3.201171875,
-0.291015625,
-0.01148223876953125,
0.051239013671875,
-0.0036373138427734375,
2.5390625,
-3.994140625,
1.3720703125,
3.44921875,
-1.4873046875,
-2.544921875,
-0.931640625,
-2.330078125,
-3.685546875,
-0.489013671875,
2.6953125,
3.9921875,
1.8154296875,
-0.6982421875,
3.203125,
-0.320556640625,
-2.12890625,
-1.890625,
4.5859375,
-0.974609375,
0.6728515625,
3.625,
-0.13427734375,
1.13671875,
1.0712890625,
0.2452392578125,
1.1044921875,
0.94287109375,
-0.250244140625,
-0.92236328125,
1.970703125,
2.23828125,
-2.28515625,
-1.44921875,
1.9326171875,
1.6162109375,
-2.580078125,
2.80859375,
0.8876953125,
-0.12347412109375,
-1.2041015625,
1.3095703125,
1.8984375,
-0.49462890625,
0.302978515625,
3.20703125,
0.97705078125,
-0.6259765625,
-2.953125,
-1.5185546875,
-1.048828125,
-0.50634765625,
-0.69677734375,
-0.0732421875,
0.84423828125,
-1.8623046875,
-0.156494140625,
0.62841796875,
1.12890625,
0.253662109375,
0.1568603515625,
-1.5009765625,
-1.6240234375,
0.82958984375,
0.33740234375,
0.642578125,
-0.72021484375,
2.212890625,
1.3173828125,
0.65673828125,
-1.3115234375,
-0.31005859375,
0.6259765625,
-2.712890625,
2.0078125,
-1.185546875,
-0.95361328125,
0.42529296875,
-0.66064453125,
0.049560546875,
2.763671875,
-2.814453125,
-0.58837890625,
1.3388671875,
2.14453125,
3.814453125,
-0.036529541015625,
0.99560546875,
-1.8681640625,
2.064453125,
3.853515625,
0.638671875,
-3.279296875,
0.99853515625,
-1.806640625,
-0.6416015625,
-1.8701171875,
0.1728515625,
-0.78662109375,
-0.740234375,
-1.11328125,
0.79345703125,
-0.1893310546875,
-0.68212890625,
-2.5859375,
0.216552734375,
2.697265625,
2.3203125,
-0.6494140625,
0.68896484375,
-2.4140625,
1.8115234375,
0.27734375,
0.72509765625,
1.2587890625,
-3.5078125,
-0.4345703125,
1.00390625,
0.1363525390625,
2.701171875,
2.43359375,
-1.041015625,
-0.271240234375,
-1.8857421875,
-0.9951171875,
1.2802734375,
0.77734375,
-0.0682373046875,
-1.5849609375,
1.515625,
1.6455078125,
-2.478515625,
-1.654296875,
0.78466796875,
-1.65234375,
-1.251953125,
-0.52490234375,
-0.79345703125,
1.3037109375,
0.39892578125,
1.662109375,
0.02520751953125,
0.4638671875,
1.34765625,
0.264404296875,
-1.12109375,
0.97900390625,
0.626953125,
2.052734375,
-1.0888671875,
0.38232421875,
1.6474609375,
-1.49609375,
-1.3037109375,
-0.90283203125,
-0.6494140625,
-1.2275390625,
0.0133819580078125,
2.40234375,
0.00042510032653808594,
1.623046875,
1.662109375,
-1.09375,
1.1337890625,
-0.2353515625,
1.580078125,
-2.26953125,
1.7353515625,
1.5615234375,
1.6064453125,
0.3681640625,
0.43115234375,
-1.1318359375,
-2.890625,
0.04400634765625,
0.41845703125,
0.32080078125,
0.658203125,
-2.287109375,
0.55908203125,
1.0634765625,
-0.1236572265625,
-1.3779296875,
-2.017578125,
-1.833984375,
0.0343017578125,
2.142578125,
0.1461181640625,
-2.064453125,
-4.23828125,
0.517578125,
-0.61083984375,
1.7197265625,
1.572265625,
0.1539306640625,
-0.53564453125,
1.0322265625,
-3.087890625,
2.84375,
-2.384765625,
-0.458251953125,
0.134765625,
-2.119140625,
-1.8583984375,
0.43994140625,
0.378173828125,
-4.59375,
-4.49609375,
-0.256591796875,
1.2685546875,
-1.08203125,
2.7265625,
-2.462890625,
-2.458984375,
-0.75244140625,
-2.580078125,
2.091796875,
-0.9775390625,
-1.5498046875,
0.0426025390625,
1.310546875,
-0.474609375,
-1.890625,
-1.8017578125,
1.0390625,
1.7705078125,
2.2578125,
0.0193939208984375,
-1.2060546875,
2.185546875,
-0.1024169921875,
-4.2421875,
-2.8984375,
14.5,
1.5693359375,
-0.2239990234375,
0.0667724609375,
-0.01247406005859375,
-2.080078125,
-2.763671875,
-1.7333984375,
1.400390625,
3.15234375,
-1.2294921875,
0.31787109375,
1.955078125,
-0.019195556640625,
-1.8046875,
-0.2406005859375,
0.55615234375,
-0.5419921875,
-0.56591796875,
-0.59130859375,
5.7578125,
-1.2529296875,
-1.9443359375,
0.70751953125,
0.139404296875,
0.0149383544921875,
-1.08984375,
-2.708984375,
1.634765625,
1.427734375,
-0.876953125,
2.232421875,
0.1956787109375,
5.109375,
-0.64794921875,
-1.375,
-2.212890625,
-5.703125,
1.4384765625,
0.5537109375,
0.391357421875,
-0.09307861328125,
0.08685302734375,
1.76171875,
0.962890625,
1.21875,
-0.67041015625,
-1.9208984375,
-2.583984375,
-1.134765625,
-1.8994140625,
-0.39990234375,
-0.2890625,
-0.408447265625,
-1.091796875,
3.861328125,
-1.1318359375,
1.341796875,
-2.39453125,
-1.0146484375,
0.368896484375,
-0.69580078125,
1.935546875,
-0.1558837890625,
-3.09375,
1.8701171875,
-1.826171875,
3.20703125,
2.392578125,
-0.6611328125,
-0.242919921875,
0.29638671875,
-3.341796875,
0.7587890625,
-0.65966796875,
0.470703125,
-0.41796875,
-5.9296875,
-1.044921875,
0.705078125,
-4.4296875,
-1.68359375,
-1.5732421875,
1.375,
-2.103515625,
-0.86474609375,
-1.5654296875,
0.09912109375,
0.468994140625,
1.974609375,
-3.095703125,
-0.466064453125,
-1.5390625,
-2.994140625,
-0.4033203125,
0.259033203125,
0.828125,
2.90234375,
0.292236328125,
-0.48681640625,
-2.771484375,
2.119140625,
2.072265625,
1.00390625,
-0.623046875,
2.9296875,
-0.484375,
2.642578125,
-3.279296875,
0.243408203125,
-0.69287109375,
0.461181640625,
4.0625,
-0.10601806640625,
0.137451171875,
-0.1119384765625,
-0.8828125,
-2.24609375,
3.009765625,
-2.404296875,
0.66650390625,
1.4150390625,
-0.90234375,
-1.8017578125,
-3.001953125,
1.2080078125,
1.7578125,
0.6123046875,
-2.953125,
-0.26171875,
-2.787109375,
0.9013671875,
-1.068359375,
1.1474609375,
-0.3720703125,
1.6796875,
2.498046875,
1.326171875,
0.5966796875,
2.892578125,
2.666015625,
2.791015625,
-1.685546875,
-0.390380859375,
1.671875,
1.9208984375,
-0.31689453125,
-1.9423828125,
-0.1123046875,
-0.05828857421875,
1.3564453125,
-1.166015625,
2.103515625,
0.333740234375,
0.7138671875,
-0.04351806640625,
-0.5048828125,
-0.1485595703125,
0.56005859375,
3.4765625,
-0.86376953125,
2.4140625,
-1.8984375,
-0.44873046875,
-2.984375,
-2.21484375,
1.990234375,
-2.044921875,
0.35693359375,
-0.2666015625,
1.5888671875,
0.61669921875,
-0.6904296875,
0.30810546875,
-1.751953125,
-4.125,
1.8974609375,
-0.794921875,
-0.8662109375,
-1.578125,
0.587890625,
2.564453125,
-1.33984375,
1.798828125,
-0.96044921875,
-0.1142578125,
1.9072265625,
1.0478515625,
0.826171875,
1.3115234375,
-1.2705078125,
1.0732421875,
0.450927734375,
0.361083984375,
-0.71923828125,
-1.5986328125,
1.82421875,
0.98291015625,
1.99609375,
-1.7880859375,
-2.15625,
-2.869140625,
2.404296875,
0.46142578125,
3.173828125,
-0.81103515625,
0.432861328125,
-3.97265625,
-2.646484375,
0.4931640625,
1.0146484375,
-0.217041015625,
-0.364990234375,
-0.794921875,
-0.7568359375,
0.4267578125,
-0.3193359375,
-0.7998046875,
-3.75390625,
1.9794921875,
-0.73486328125,
-1.7392578125,
0.7958984375,
-0.71875,
-1.072265625,
0.422607421875,
1.1416015625,
-1.0986328125,
1.46875,
-0.09625244140625,
-5.26171875,
0.48486328125,
-1.0029296875,
-1.98046875,
0.1612548828125,
1.291015625,
-1.0283203125,
0.72705078125,
0.6015625,
-0.2061767578125,
0.98681640625,
2.453125,
-0.5419921875,
-0.5927734375,
2.54296875,
0.11236572265625,
-2.93359375,
-1.0234375,
-1.2216796875,
-0.190673828125,
3.1953125,
-0.10552978515625,
0.9755859375,
1.76953125,
-0.5703125,
-0.578125,
1.490234375,
0.320556640625,
-1.0166015625,
-0.10064697265625,
-0.83984375,
0.385986328125,
1.4951171875,
0.07440185546875,
-1.21875,
1.4921875,
-2.021484375,
-1.7744140625,
16.328125,
1.0419921875,
3.314453125,
3.0,
-0.83154296875,
-0.02728271484375,
0.49951171875,
5.44140625,
0.08636474609375,
-2.93359375,
-0.365478515625,
-0.50927734375,
-0.39892578125,
-2.533203125,
2.71484375,
-0.04461669921875,
-0.14892578125,
0.89208984375,
-3.880859375,
-1.71484375,
-1.0634765625,
1.7685546875,
-1.2451171875,
0.6455078125,
1.578125,
0.75634765625,
0.337890625,
-1.4501953125,
1.8056640625,
1.9345703125,
0.43896484375,
-0.050048828125,
1.46484375,
-0.44140625,
-0.48779296875,
0.11676025390625,
-0.455078125,
-1.13671875,
-2.76171875,
0.6904296875,
-0.86474609375,
-0.810546875,
-3.609375,
1.814453125,
0.033721923828125,
-0.18505859375,
1.3662109375,
-0.51318359375,
-0.9697265625,
-1.92578125,
0.3916015625,
-2.140625,
-0.9775390625,
-0.30615234375,
0.426513671875,
-1.546875,
1.5810546875,
0.875,
0.329833984375,
-1.7705078125,
1.7431640625,
-0.2042236328125,
1.017578125,
-1.0927734375,
-0.2353515625,
1.23046875,
-0.90673828125,
1.0400390625,
-0.1156005859375,
-1.125,
0.09307861328125,
1.974609375,
-1.7744140625,
-0.401123046875,
0.60205078125,
-1.205078125,
1.2255859375,
0.245849609375,
-0.751953125,
1.0400390625,
2.662109375,
-2.431640625,
-2.583984375,
1.693359375,
-1.5869140625,
-0.658203125,
0.345458984375,
0.44384765625,
0.287109375,
1.2607421875,
2.5078125,
0.76123046875,
0.7041015625,
0.066650390625,
0.32373046875,
0.25048828125,
-1.14453125,
10.3359375,
-1.21484375,
-0.2171630859375,
2.1953125,
-4.56640625,
-0.44091796875,
1.955078125,
-0.75390625,
0.9560546875,
-0.48388671875,
1.7587890625,
-0.794921875,
0.056671142578125,
-1.4892578125,
1.9287109375,
-0.345703125,
-9.046875,
-2.33984375,
2.28125,
-2.134765625,
-0.254638671875,
0.7138671875,
3.73046875,
-1.0087890625,
-1.9736328125,
-2.85546875,
-1.525390625,
2.19921875,
0.41357421875,
-1.255859375,
-1.1845703125,
-0.242431640625,
-0.055877685546875,
-1.341796875,
-2.2421875,
1.8291015625,
-0.29443359375,
0.254638671875,
0.69189453125,
-2.33984375,
1.021484375,
0.53955078125,
-0.042572021484375,
-0.156005859375,
2.779296875,
1.3447265625,
-2.751953125,
-2.984375,
0.71728515625,
3.1640625,
-0.96484375,
1.7802734375,
-1.0458984375,
-1.984375,
0.1636962890625,
0.8251953125,
-2.544921875,
1.2333984375,
-0.8779296875,
-0.1053466796875,
0.135498046875,
1.8955078125,
-1.1123046875,
0.9814453125,
-2.064453125,
-0.2479248046875,
1.4873046875,
0.1307373046875,
0.63037109375,
0.0173492431640625,
-0.974609375,
-1.1767578125,
-1.291015625,
1.3740234375,
0.71875,
3.734375,
1.4716796875,
-3.2734375,
-1.638671875,
-0.65625,
-0.98095703125,
2.451171875,
3.416015625,
3.556640625,
-0.1214599609375,
-0.71630859375,
2.189453125,
-1.29296875,
2.099609375,
0.2100830078125,
-0.353515625,
-0.94921875,
-3.447265625,
-0.83056640625,
-0.349853515625,
1.45703125,
4.9140625,
-1.2890625,
-1.0078125,
-1.1044921875,
1.087890625,
-0.2037353515625,
4.28515625,
1.814453125,
1.31640625,
-0.95458984375,
1.654296875,
-2.333984375,
0.258544921875,
0.272216796875,
2.3984375,
-4.65625,
-1.0068359375,
-0.62255859375,
1.47265625,
0.77001953125,
1.396484375,
-0.031524658203125,
0.82373046875,
1.8818359375,
-0.681640625,
0.83251953125,
-1.728515625,
-0.80224609375,
-0.50732421875,
-1.6875,
-2.099609375,
-0.142578125,
2.14453125,
-2.443359375,
0.0430908203125,
-2.5078125,
1.826171875,
2.048828125,
-0.49169921875,
-0.31982421875,
1.505859375,
-0.45751953125,
-2.255859375,
-1.201171875,
1.3330078125,
-0.96142578125,
1.9541015625,
1.7890625,
-0.96728515625,
-2.16015625,
0.5625,
-0.85009765625,
2.181640625,
0.525390625,
-0.272705078125,
1.6142578125,
0.494140625,
15.6484375,
-0.427978515625,
-0.08184814453125,
-0.28369140625,
-1.236328125,
1.0712890625,
-0.7841796875,
-2.5703125,
-1.5283203125,
-1.12890625,
0.8642578125,
4.51953125,
-2.630859375,
0.7666015625,
0.6171875,
1.1572265625,
0.4560546875,
5.0234375,
2.23046875,
2.306640625,
5.6796875,
1.953125,
-0.42578125,
-1.3330078125,
-1.044921875,
0.1165771484375,
1.7626953125,
2.017578125,
1.380859375,
1.4560546875,
-0.09979248046875,
0.73388671875,
1.0048828125,
2.439453125,
1.4970703125,
-1.353515625,
-0.42236328125,
2.333984375,
1.57421875,
-0.70361328125,
0.55078125,
1.4140625,
3.44140625,
0.8583984375,
0.896484375,
1.548828125,
0.1866455078125,
-0.318359375,
1.01171875,
-1.05859375,
-1.0126953125,
0.61083984375,
0.351318359375,
2.009765625,
-1.2451171875,
-1.1845703125,
-1.3671875,
-0.96142578125,
0.342529296875,
-1.58203125,
-0.2293701171875,
0.80908203125,
-1.392578125,
-3.060546875,
0.29736328125,
0.3056640625,
0.39404296875,
-1.1611328125,
-0.1417236328125,
0.64306640625,
0.247314453125,
2.197265625,
-2.498046875,
-0.270751953125,
0.439697265625,
-1.50390625,
-2.037109375,
-0.43896484375,
1.484375,
2.185546875,
0.385986328125,
4.4375,
-2.228515625,
-1.3427734375,
-1.9091796875,
-3.6171875,
1.7216796875,
-1.314453125,
1.4111328125,
-1.47265625,
0.9140625,
0.56982421875,
-2.755859375,
0.74853515625,
-0.4638671875,
-2.259765625,
0.9677734375,
-2.8828125,
2.19140625,
1.603515625,
0.44775390625,
2.23046875,
1.5517578125,
-2.5078125,
0.78759765625,
-0.5302734375,
-1.9755859375,
1.8974609375,
-0.294677734375,
-0.467041015625,
0.271240234375,
-2.17578125,
0.46240234375,
1.365234375,
-0.8701171875,
0.42529296875,
1.154296875,
0.40869140625,
-3.4765625,
1.724609375,
1.5126953125,
-1.1455078125,
-2.87109375,
2.9921875,
0.2413330078125,
2.1015625,
-1.1865234375,
0.16552734375,
1.77734375,
0.03497314453125,
2.05078125,
0.10076904296875,
-2.5859375,
-0.15283203125,
-0.6142578125,
-0.7919921875,
2.119140625,
0.58154296875,
-1.6416015625,
0.857421875,
-1.2109375,
-1.322265625,
1.4384765625,
0.83154296875,
0.34912109375,
-0.1492919921875,
-0.374755859375,
0.63818359375,
-1.212890625,
-2.9921875,
-0.0931396484375,
-2.990234375,
0.94287109375,
2.98046875,
0.04119873046875,
-1.978515625,
-2.84765625,
-0.70703125,
0.177490234375,
-0.1644287109375,
2.28515625,
0.4033203125,
1.9638671875,
-4.05859375,
-0.54931640625,
-1.2236328125,
1.8427734375,
-1.80078125,
0.06829833984375,
-1.240234375,
-0.9814453125,
-0.97998046875,
1.0478515625,
0.06341552734375,
-8.8046875,
-3.48828125,
2.54296875,
0.8037109375,
-1.2783203125,
-0.61279296875,
0.62109375,
-2.28125,
0.70166015625,
0.395751953125,
-0.224609375,
-1.3251953125,
1.041015625,
-0.62353515625,
-1.66015625,
-1.6123046875,
0.86669921875,
0.724609375,
-1.5869140625,
1.087890625,
1.3232421875,
1.119140625,
0.94921875,
-2.244140625,
0.34228515625,
0.437255859375,
0.137939453125,
1.353515625,
2.119140625,
1.5927734375,
-0.41796875,
-1.8857421875,
-1.0810546875,
-0.1409912109375,
0.0791015625,
2.3203125,
1.5732421875,
-1.2470703125,
-1.6669921875,
1.6748046875,
2.58984375,
-6.12890625,
0.185302734375,
0.580078125,
0.9541015625,
-1.984375,
1.4814453125,
0.29541015625,
2.232421875,
1.6826171875,
0.716796875,
1.2763671875,
-4.48828125,
-1.5078125,
-3.189453125,
-0.86328125,
0.042266845703125,
1.892578125,
0.537109375,
-0.814453125,
1.083984375,
2.03125,
0.49658203125,
-0.57666015625,
3.447265625,
1.76171875,
-0.465576171875,
-1.0673828125,
-0.298828125,
-1.166015625,
-0.138427734375,
-3.3828125,
-1.0859375,
-1.9541015625,
2.189453125,
-1.095703125,
-1.20703125,
-0.595703125
],
"evol_instruction_score": 7.0,
"evol_response_score": 7.0,
"evolved_instruction": "Enhance the existing Python script by identifying and improving performance bottlenecks, incorporating comprehensive exception handling, and refining algorithmic complexities.",
"evolved_response": "To enhance the existing Python script, we can follow the steps mentioned below:\n\n1. Identify and improve performance bottlenecks:\n- Use profiling tools like cProfile to identify the parts of the script that are taking the most time to execute.\n- Once identified, consider optimizing the code by using more efficient data structures, algorithms, or techniques.\n- Utilize libraries like NumPy for numerical computations or Pandas for data manipulation, which can significantly improve performance.\n\n2. Incorporate comprehensive exception handling:\n- Add try-except blocks to handle exceptions gracefully and provide meaningful error messages to the user.\n- Consider using specific exception handling to catch and handle different types of errors appropriately.\n- Use logging to capture and report errors for debugging purposes.\n\n3. Refine algorithmic complexities:\n- Analyze the existing algorithms used in the script and identify opportunities for optimization.\n- Consider using more efficient algorithms or data structures to reduce time complexity and improve overall performance.\n- Break down complex algorithms into smaller, more manageable functions for better readability and maintainability.\n\nBy following these steps, we can enhance the existing Python script by improving its performance, incorporating comprehensive exception handling, and refining algorithmic complexities.",
"meta": {
"category": "Natural language to code",
"completion": "The original code counts the number of lowercase vowels in a string, but the count should probably be case-insensitive. Here is a case-insensitive version.\r\n\r\nfunction countVowels(str) {\r\n let vowels = [\u0027a\u0027, \u0027e\u0027, \u0027i\u0027, \u0027o\u0027, \u0027u\u0027];\r\n let count = 0;\r\n \r\n for (let i = 0; i \u003c str.length; i++) {\r\n if (vowels.includes(str[i].toLowerCase())) {\r\n count++;\r\n }\r\n }\r\n \r\n return count;\r\n}",
"id": 6,
"input": null,
"motivation_app": null,
"prompt": "Can you find and correct any logical errors in the following code snippet and output the corrected code?\r\n\r\nfunction countVowels(str) {\r\n let vowels = [\u0027a\u0027, \u0027e\u0027, \u0027i\u0027, \u0027o\u0027, \u0027u\u0027];\r\n let count = 0;\r\n \r\n for (let i = 0; i \u003c str.length; i++) {\r\n if (vowels.includes(str[i])) {\r\n count++;\r\n }\r\n }\r\n \r\n return count;\r\n}",
"source": "surge",
"subcategory": "Debugging"
},
"model_name": "gpt-3.5-turbo",
"nearest_neighbor_distance": 0.1439239392804652
}
```
This subset can be loaded as:
```python
from datasets import load_dataset
ds = load_dataset("distilabel-internal-testing/deita-colab", "deita_filtering")
```
</details>
|
IneedtolearnDL/Codedata | ---
license: mit
---
|
Norquinal/OpenCAI | ---
license: cc-by-nc-4.0
language:
- en
size_categories:
- 10K<n<100K
---
This dataset comprises roleplay chat conversations scraped from several Discord RP fandom servers. The conversations have been split by day, on the assumption that most long-form roleplays are started/continued and completed within a single day.
The original dataset consisted of ~90K samples. Light filtering stripped that down to ~18K samples. Stricter filtering stripped it down to ~8K samples. The strictest filtering stripped it down to ~2K samples.
Some effort was made to remove OOC chatter, links, and other miscellaneous fluff, but more work still needs to be done. This isn't a "completed" dataset so much as a test to see whether the data gathered is conducive to training LLMs for roleplay purposes. If it is determined to be useful, I will continue to scrape more data.
In here are several files:
* `discord_rp_with_token_counts.json` - The original dataset in all its unprocessed glory. ~90k items. Total Average Token Length for all items: ~143 tokens.
* `125_tokens_10_messages_discord_rp.json` (Strictest) - Original dataset filtered for an average token length of 125 and a minimum conversation length of 10 messages. Mostly unprocessed. Average Length: 205 tokens.
* `80_tokens_6_messages_discord_rp.json` (Stricter) - Original dataset filtered for an average token length of 80 tokens and a minimum conversation length of 6 messages. Mostly unprocessed. Average Length: 181 tokens. This set is a superset of the Strictest set above, so use one or the other, but not both.
* `80_tokens_3_messages_discord_rp.json` (Light) - Original dataset filtered for an average token length of 80 tokens and a minimum conversation length of 3 messages. Mostly unprocessed. Average Length: 202 tokens. This set is a superset of the Stricter set above, so use one or the other, but not both.
* `opencai_rp.json` - Original dataset filtered for an average token length of 125 tokens and a minimum conversation length of 10 messages, then processed. Contains descriptions of characters, summary, scene, and genre tags provided by `gpt-3.5-turbo-16k`.
* `opencai_rp_metharme.json` - Original dataset filtered for an average token length of 125 tokens and a minimum conversation length of 10 messages, then processed, filtered to 1229 samples, and converted to metharme format.
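The per-tier filtering described above can be sketched roughly as follows. This is a hypothetical reconstruction: the actual filtering code and tokenizer are not published in this card, and the whitespace token count below is an assumption.

```python
# Hypothetical sketch of the average-token-length / minimum-message-count
# filtering described above. The real filter likely used a proper tokenizer;
# a naive whitespace split stands in for it here.

def passes_filter(conversation, min_avg_tokens, min_messages):
    """Keep a conversation only if it is long enough and dense enough."""
    if len(conversation) < min_messages:
        return False
    token_counts = [len(turn["message"].split()) for turn in conversation]
    avg_tokens = sum(token_counts) / len(token_counts)
    return avg_tokens >= min_avg_tokens

# e.g. the "Strictest" tier: average of 125 tokens over at least 10 messages
convo = [{"message": "word " * 130} for _ in range(12)]
print(passes_filter(convo, 125, 10))  # True
```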
Explanation of Properties:
* `timestamp` - Date of the interaction in YYYY-MM-DD format
* `conversations`: The conversation between the users in the chat, represented as a list of dictionaries. Each dictionary represents a single utterance and contains three key-value pairs: `message`, the utterance itself; `author`, the sender's Discord username; and `is_bot`, which designates whether the message was sent by a human or a bot. `is_bot` was determined by checking whether the author still had a discriminator and is therefore not 100% accurate. |
AdapterOcean/med_alpaca_standardized_cluster_16_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 10211517
num_examples: 7409
download_size: 5503691
dataset_size: 10211517
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_16_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sooolee/rexthetoy | ---
size_categories:
- n<1K
---
# Dataset Card for Dataset Name
RexTheToy
### Dataset Summary
Images of Rex the toy from Toy Story, stored for the Hugging Face DreamBooth hackathon.
|
jondurbin/airoboros-gpt4-1.1 | ---
license: cc-by-nc-4.0
---
The data was generated by gpt-4, and therefore is subject to OpenAI ToS. The tool used to generate the data [airoboros](https://github.com/jondurbin/airoboros) is apache-2.
Specific areas of focus for this training data:
* trivia
* math
* nonsensical math
* coding
* closed context question answering
* closed context question answering, with multiple contexts to choose from as confounding factors
* writing
* multiple choice
This is largely an overlap of the original [dataset](https://huggingface.co/datasets/jondurbin/airoboros-gpt4), but with a few extras:
* fixed contextual entries that were missing closing tags (e.g. "ENDINPUT", "ENDINSTRUCTION", etc.)
* fixed an issue where source information was provided, even if not asked (the model always tried to provide source info)
* added some questions that were unrelated to the provided context, to train the model to say when it can't provide an answer
* added several new contextual instructions, including some with FAQ style, to hopefully prevent questions in the context from breaking the inference
* hundreds more coding samples, focusing primarily on python, java, javascript, c/c++, and golang
### Usage and License Notices
All airoboros models and datasets are intended and licensed for research use only. I've used the 'cc-by-nc-4.0' license, but really it is subject to a custom/special license because:
- the base model is LLaMa, which has its own special research license
- the dataset(s) were generated with OpenAI (gpt-4 and/or gpt-3.5-turbo), which has a clause saying the data can't be used to create models that compete with OpenAI
So, to reiterate: this model (and datasets) cannot be used commercially. |
Elen22/Armenian-news-sentiment-dataset | ---
license: apache-2.0
---
# Hugging Face Armenian News Sentiment Dataset Repository
## Introduction
This repository contains a collection of Armenian texts annotated with sentiment information (negative, neutral, or positive). The collection contains training, development, and test sets, and was created at the Center of Advanced Technologies at Russian-Armenian University.
## Training and Development Data
For training and development datasets, we translated the [Stanford Sentiment Treebank](https://github.com/stanfordnlp/sentiment-treebank), [MELD](https://github.com/declare-lab/MELD/tree/master/data/MELD), and [SLSD](https://archive.ics.uci.edu/dataset/331/sentiment+labelled+sentences) datasets into Armenian using Google Translate.
The translated data was split linearly (without shuffling) into train/dev sets by hand at an 80%/20% ratio.
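The linear 80/20 split described above can be sketched as follows. This is a hypothetical illustration; the card states the actual split was done by hand.

```python
def linear_split(examples, train_fraction=0.8):
    """Split a list linearly (no shuffling) into train and dev portions."""
    cut = int(len(examples) * train_fraction)
    return examples[:cut], examples[cut:]

train, dev = linear_split(list(range(100)))
print(len(train), len(dev))  # 80 20
```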
| Dataset | Total Examples | Negative | Neutral | Positive |
|---------|---------------|---------|---------|----------|
| Training | 15,983 | 4,574 | 5,773 | 5,636 |
| Development | 4,179 | 1,184 | 1,558 | 1,437 |
## Test Dataset
The test dataset consists of Armenian news articles collected from various Armenian news pages. Each news article is annotated by a native speaker with a sentiment label indicating whether the sentiment expressed in the article is negative (0), neutral (1), or positive (2).
The sentiment labels were assigned based on natural language understanding and the subjective judgment of the annotators. Each example was processed by at least 2 annotators. We have included only those examples in the dataset where all annotators unanimously agreed on the sentiment labels.
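The unanimity filter described above can be sketched as follows. This is a hypothetical reconstruction; the actual annotation tooling is not part of this card.

```python
def keep_unanimous(annotations):
    """Keep only items where every annotator assigned the same label."""
    return [labels[0] for labels in annotations if len(set(labels)) == 1]

# Each inner list holds the labels (0=negative, 1=neutral, 2=positive)
# assigned by the annotators of a single article.
print(keep_unanimous([[2, 2], [0, 1], [1, 1, 1]]))  # [2, 1]
```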
The test set contains 956 examples in total (295 negative, 476 neutral, 185 positive).
## Repository Information
- The repository contains the annotated train, development and test datasets. The authors hope that the resources provided in this repository can serve as a starting point for researchers and developers interested in sentiment analysis for the Armenian language.
- The dataset was created by Elen Petikyan, with the help of a group of annotators (the full list of contributors is provided in CONTRIBUTING.md). The dataset creation process was supervised by Tsolak Ghukasyan.
## Contact Information
For any inquiries or feedback related to this dataset repository, please contact **Elen Petikyan** (ellen.petikyan@gmail.com).
|
sno0owing/kitti_seg | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 327499999.0
num_examples: 1000
download_size: 327439822
dataset_size: 327499999.0
---
|
reach-vb/test-ali | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 121033327.0
num_examples: 36
download_size: 120075953
dataset_size: 121033327.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/nachi_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of nachi/那智/那智 (Kantai Collection)
This is the dataset of nachi/那智/那智 (Kantai Collection), containing 415 images and their tags.
The core tags of this character are `long_hair, side_ponytail, black_hair, brown_eyes, very_long_hair, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 415 | 349.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nachi_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 415 | 236.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nachi_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 898 | 465.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nachi_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 415 | 322.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nachi_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 898 | 593.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nachi_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nachi_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, black_skirt, cowboy_shot, looking_at_viewer, pencil_skirt, solo, white_gloves, white_pantyhose, simple_background, white_background, hand_on_hip, military_uniform, long_sleeves |
| 1 | 10 |  |  |  |  |  | 1girl, elbow_gloves, solo, skirt, white_gloves, brown_hair, white_pantyhose, looking_at_viewer, turret |
| 2 | 23 |  |  |  |  |  | military_uniform, 1girl, hair_between_eyes, solo, upper_body, white_gloves, simple_background, looking_at_viewer, long_sleeves, white_background, jacket, blush |
| 3 | 6 |  |  |  |  |  | 1girl, solo, upper_body, white_shirt, collared_shirt, hair_between_eyes, blush, dress_shirt, long_sleeves, looking_at_viewer, simple_background, smile, white_background |
| 4 | 12 |  |  |  |  |  | 1girl, playboy_bunny, solo, fake_animal_ears, rabbit_ears, detached_collar, cleavage, blush, large_breasts, looking_at_viewer, rabbit_tail, simple_background, wrist_cuffs, black_pantyhose, hair_between_eyes, strapless_leotard, cowboy_shot, necktie |
| 5 | 6 |  |  |  |  |  | 1girl, navel, solo, black_bikini, blush, looking_at_viewer, cleavage, large_breasts, smile, dated, gloves, simple_background |
| 6 | 5 |  |  |  |  |  | 1girl, alternate_costume, solo, yukata, looking_at_viewer, obi, twitter_username, fireworks, floral_print, one-hour_drawing_challenge, purple_kimono, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_skirt | cowboy_shot | looking_at_viewer | pencil_skirt | solo | white_gloves | white_pantyhose | simple_background | white_background | hand_on_hip | military_uniform | long_sleeves | elbow_gloves | skirt | brown_hair | turret | hair_between_eyes | upper_body | jacket | blush | white_shirt | collared_shirt | dress_shirt | smile | playboy_bunny | fake_animal_ears | rabbit_ears | detached_collar | cleavage | large_breasts | rabbit_tail | wrist_cuffs | black_pantyhose | strapless_leotard | necktie | navel | black_bikini | dated | gloves | alternate_costume | yukata | obi | twitter_username | fireworks | floral_print | one-hour_drawing_challenge | purple_kimono |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:--------------|:--------------------|:---------------|:-------|:---------------|:------------------|:--------------------|:-------------------|:--------------|:-------------------|:---------------|:---------------|:--------|:-------------|:---------|:--------------------|:-------------|:---------|:--------|:--------------|:-----------------|:--------------|:--------|:----------------|:-------------------|:--------------|:------------------|:-----------|:----------------|:--------------|:--------------|:------------------|:--------------------|:----------|:--------|:---------------|:--------|:---------|:--------------------|:---------|:------|:-------------------|:------------|:---------------|:-----------------------------|:----------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | | | X | | X | X | X | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 23 |  |  |  |  |  | X | | | X | | X | X | | X | X | | X | X | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | | X | | X | | | X | X | | | X | | | | | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 12 |  |  |  |  |  | X | | X | X | | X | | | X | | | | | | | | | X | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | | X | | X | | | X | | | | | | | | | | | | X | | | | X | | | | | X | X | | | | | | X | X | X | X | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | | X | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_Isotonic__Mixnueza-6x32M-MoE | ---
pretty_name: Evaluation run of Isotonic/Mixnueza-6x32M-MoE
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Isotonic/Mixnueza-6x32M-MoE](https://huggingface.co/Isotonic/Mixnueza-6x32M-MoE)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Isotonic__Mixnueza-6x32M-MoE\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-03T02:58:26.903006](https://huggingface.co/datasets/open-llm-leaderboard/details_Isotonic__Mixnueza-6x32M-MoE/blob/main/results_2024-04-03T02-58-26.903006.json) (note
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2559514063264178,\n\
\ \"acc_stderr\": 0.030612915671212777,\n \"acc_norm\": 0.256379850516273,\n\
\ \"acc_norm_stderr\": 0.03139926274321708,\n \"mc1\": 0.24969400244798043,\n\
\ \"mc1_stderr\": 0.015152286907148128,\n \"mc2\": 0.47837127971295423,\n\
\ \"mc2_stderr\": 0.015611380743614477\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.18003412969283278,\n \"acc_stderr\": 0.011227856729050028,\n\
\ \"acc_norm\": 0.21160409556313994,\n \"acc_norm_stderr\": 0.011935916358632868\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2634933280223063,\n\
\ \"acc_stderr\": 0.004396273173717457,\n \"acc_norm\": 0.26687910774746065,\n\
\ \"acc_norm_stderr\": 0.004414246720076111\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.037857144650666544,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.037857144650666544\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21132075471698114,\n \"acc_stderr\": 0.025125766484827842,\n\
\ \"acc_norm\": 0.21132075471698114,\n \"acc_norm_stderr\": 0.025125766484827842\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.20833333333333334,\n\
\ \"acc_stderr\": 0.03396116205845333,\n \"acc_norm\": 0.20833333333333334,\n\
\ \"acc_norm_stderr\": 0.03396116205845333\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.1676300578034682,\n\
\ \"acc_stderr\": 0.02848196303214339,\n \"acc_norm\": 0.1676300578034682,\n\
\ \"acc_norm_stderr\": 0.02848196303214339\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149352,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149352\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n\
\ \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102963,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102963\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n\
\ \"acc_stderr\": 0.03835153954399421,\n \"acc_norm\": 0.21052631578947367,\n\
\ \"acc_norm_stderr\": 0.03835153954399421\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.037245636197746325,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.037245636197746325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1746031746031746,\n\
\ \"acc_stderr\": 0.03395490020856112,\n \"acc_norm\": 0.1746031746031746,\n\
\ \"acc_norm_stderr\": 0.03395490020856112\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.31290322580645163,\n\
\ \"acc_stderr\": 0.02637756702864586,\n \"acc_norm\": 0.31290322580645163,\n\
\ \"acc_norm_stderr\": 0.02637756702864586\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.02962022787479047,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02962022787479047\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3626943005181347,\n \"acc_stderr\": 0.03469713791704371,\n\
\ \"acc_norm\": 0.3626943005181347,\n \"acc_norm_stderr\": 0.03469713791704371\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.35384615384615387,\n \"acc_stderr\": 0.024243783994062167,\n\
\ \"acc_norm\": 0.35384615384615387,\n \"acc_norm_stderr\": 0.024243783994062167\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23703703703703705,\n \"acc_stderr\": 0.02592887613276611,\n \
\ \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.02592887613276611\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3403361344537815,\n \"acc_stderr\": 0.03077805742293167,\n \
\ \"acc_norm\": 0.3403361344537815,\n \"acc_norm_stderr\": 0.03077805742293167\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.26238532110091745,\n \"acc_stderr\": 0.01886188502153473,\n \"\
acc_norm\": 0.26238532110091745,\n \"acc_norm_stderr\": 0.01886188502153473\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.28921568627450983,\n\
\ \"acc_stderr\": 0.03182231867647553,\n \"acc_norm\": 0.28921568627450983,\n\
\ \"acc_norm_stderr\": 0.03182231867647553\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.25316455696202533,\n \"acc_stderr\": 0.028304657943035307,\n\
\ \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.028304657943035307\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2914798206278027,\n\
\ \"acc_stderr\": 0.030500283176545906,\n \"acc_norm\": 0.2914798206278027,\n\
\ \"acc_norm_stderr\": 0.030500283176545906\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2037037037037037,\n\
\ \"acc_stderr\": 0.03893542518824848,\n \"acc_norm\": 0.2037037037037037,\n\
\ \"acc_norm_stderr\": 0.03893542518824848\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.15178571428571427,\n\
\ \"acc_stderr\": 0.034057028381856924,\n \"acc_norm\": 0.15178571428571427,\n\
\ \"acc_norm_stderr\": 0.034057028381856924\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.18376068376068377,\n\
\ \"acc_stderr\": 0.025372139671722933,\n \"acc_norm\": 0.18376068376068377,\n\
\ \"acc_norm_stderr\": 0.025372139671722933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.29118773946360155,\n\
\ \"acc_stderr\": 0.016246087069701397,\n \"acc_norm\": 0.29118773946360155,\n\
\ \"acc_norm_stderr\": 0.016246087069701397\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.22832369942196531,\n \"acc_stderr\": 0.022598703804321635,\n\
\ \"acc_norm\": 0.22832369942196531,\n \"acc_norm_stderr\": 0.022598703804321635\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2759776536312849,\n\
\ \"acc_stderr\": 0.014950103002475356,\n \"acc_norm\": 0.2759776536312849,\n\
\ \"acc_norm_stderr\": 0.014950103002475356\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.024051029739912255,\n\
\ \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.024051029739912255\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2958199356913183,\n\
\ \"acc_stderr\": 0.025922371788818798,\n \"acc_norm\": 0.2958199356913183,\n\
\ \"acc_norm_stderr\": 0.025922371788818798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.024922001168886338,\n\
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.024922001168886338\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.02601199293090201,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.02601199293090201\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2320730117340287,\n\
\ \"acc_stderr\": 0.01078204666590518,\n \"acc_norm\": 0.2320730117340287,\n\
\ \"acc_norm_stderr\": 0.01078204666590518\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2042483660130719,\n \"acc_stderr\": 0.01630975584836153,\n \
\ \"acc_norm\": 0.2042483660130719,\n \"acc_norm_stderr\": 0.01630975584836153\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072774,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072774\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.22448979591836735,\n \"acc_stderr\": 0.02671143055553841,\n\
\ \"acc_norm\": 0.22448979591836735,\n \"acc_norm_stderr\": 0.02671143055553841\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.03460579907553027,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.03460579907553027\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24969400244798043,\n\
\ \"mc1_stderr\": 0.015152286907148128,\n \"mc2\": 0.47837127971295423,\n\
\ \"mc2_stderr\": 0.015611380743614477\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5185477505919495,\n \"acc_stderr\": 0.014042813708888378\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.003032600454890068,\n \
\ \"acc_stderr\": 0.0015145735612245505\n }\n}\n```"
repo_url: https://huggingface.co/Isotonic/Mixnueza-6x32M-MoE
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|arc:challenge|25_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|gsm8k|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hellaswag|10_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T02-58-26.903006.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T02-58-26.903006.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- '**/details_harness|winogrande|5_2024-04-03T02-58-26.903006.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-03T02-58-26.903006.parquet'
- config_name: results
data_files:
- split: 2024_04_03T02_58_26.903006
path:
- results_2024-04-03T02-58-26.903006.parquet
- split: latest
path:
- results_2024-04-03T02-58-26.903006.parquet
---
# Dataset Card for Evaluation run of Isotonic/Mixnueza-6x32M-MoE
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Isotonic/Mixnueza-6x32M-MoE](https://huggingface.co/Isotonic/Mixnueza-6x32M-MoE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Isotonic__Mixnueza-6x32M-MoE",
"harness_winogrande_5",
split="train")
```
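Each timestamped split name is simply the run timestamp with `-` and `:` replaced by `_` (the fractional-second dot is kept), so `2024-04-03T02:58:26.903006` becomes the split `2024_04_03T02_58_26.903006`. A minimal sketch of deriving a split name from a run timestamp (the helper name is illustrative, not part of any library):

```python
def split_name_from_timestamp(run_timestamp: str) -> str:
    """Map a run timestamp such as '2024-04-03T02:58:26.903006'
    to its split name '2024_04_03T02_58_26.903006'."""
    return run_timestamp.replace("-", "_").replace(":", "_")

print(split_name_from_timestamp("2024-04-03T02:58:26.903006"))
```

Passing such a name as `split=` instead of `"train"` selects one specific run rather than the latest one.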
## Latest results
These are the [latest results from run 2024-04-03T02:58:26.903006](https://huggingface.co/datasets/open-llm-leaderboard/details_Isotonic__Mixnueza-6x32M-MoE/blob/main/results_2024-04-03T02-58-26.903006.json). Note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval:
```python
{
"all": {
"acc": 0.2559514063264178,
"acc_stderr": 0.030612915671212777,
"acc_norm": 0.256379850516273,
"acc_norm_stderr": 0.03139926274321708,
"mc1": 0.24969400244798043,
"mc1_stderr": 0.015152286907148128,
"mc2": 0.47837127971295423,
"mc2_stderr": 0.015611380743614477
},
"harness|arc:challenge|25": {
"acc": 0.18003412969283278,
"acc_stderr": 0.011227856729050028,
"acc_norm": 0.21160409556313994,
"acc_norm_stderr": 0.011935916358632868
},
"harness|hellaswag|10": {
"acc": 0.2634933280223063,
"acc_stderr": 0.004396273173717457,
"acc_norm": 0.26687910774746065,
"acc_norm_stderr": 0.004414246720076111
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.037857144650666544,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.037857144650666544
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21132075471698114,
"acc_stderr": 0.025125766484827842,
"acc_norm": 0.21132075471698114,
"acc_norm_stderr": 0.025125766484827842
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.03396116205845333,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.03396116205845333
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.1676300578034682,
"acc_stderr": 0.02848196303214339,
"acc_norm": 0.1676300578034682,
"acc_norm_stderr": 0.02848196303214339
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149352,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149352
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102963,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102963
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03835153954399421,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03835153954399421
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.037245636197746325,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.037245636197746325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1746031746031746,
"acc_stderr": 0.03395490020856112,
"acc_norm": 0.1746031746031746,
"acc_norm_stderr": 0.03395490020856112
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.31290322580645163,
"acc_stderr": 0.02637756702864586,
"acc_norm": 0.31290322580645163,
"acc_norm_stderr": 0.02637756702864586
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02962022787479047,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02962022787479047
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3626943005181347,
"acc_stderr": 0.03469713791704371,
"acc_norm": 0.3626943005181347,
"acc_norm_stderr": 0.03469713791704371
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.35384615384615387,
"acc_stderr": 0.024243783994062167,
"acc_norm": 0.35384615384615387,
"acc_norm_stderr": 0.024243783994062167
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.02592887613276611,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.02592887613276611
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3403361344537815,
"acc_stderr": 0.03077805742293167,
"acc_norm": 0.3403361344537815,
"acc_norm_stderr": 0.03077805742293167
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26238532110091745,
"acc_stderr": 0.01886188502153473,
"acc_norm": 0.26238532110091745,
"acc_norm_stderr": 0.01886188502153473
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.28921568627450983,
"acc_stderr": 0.03182231867647553,
"acc_norm": 0.28921568627450983,
"acc_norm_stderr": 0.03182231867647553
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.028304657943035307,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.028304657943035307
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2914798206278027,
"acc_stderr": 0.030500283176545906,
"acc_norm": 0.2914798206278027,
"acc_norm_stderr": 0.030500283176545906
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.03893542518824848,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.03893542518824848
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.15178571428571427,
"acc_stderr": 0.034057028381856924,
"acc_norm": 0.15178571428571427,
"acc_norm_stderr": 0.034057028381856924
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.18376068376068377,
"acc_stderr": 0.025372139671722933,
"acc_norm": 0.18376068376068377,
"acc_norm_stderr": 0.025372139671722933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.29118773946360155,
"acc_stderr": 0.016246087069701397,
"acc_norm": 0.29118773946360155,
"acc_norm_stderr": 0.016246087069701397
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.22832369942196531,
"acc_stderr": 0.022598703804321635,
"acc_norm": 0.22832369942196531,
"acc_norm_stderr": 0.022598703804321635
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2759776536312849,
"acc_stderr": 0.014950103002475356,
"acc_norm": 0.2759776536312849,
"acc_norm_stderr": 0.014950103002475356
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.024051029739912255,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.024051029739912255
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2958199356913183,
"acc_stderr": 0.025922371788818798,
"acc_norm": 0.2958199356913183,
"acc_norm_stderr": 0.025922371788818798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.024922001168886338,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.024922001168886338
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.02601199293090201,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.02601199293090201
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2320730117340287,
"acc_stderr": 0.01078204666590518,
"acc_norm": 0.2320730117340287,
"acc_norm_stderr": 0.01078204666590518
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2042483660130719,
"acc_stderr": 0.01630975584836153,
"acc_norm": 0.2042483660130719,
"acc_norm_stderr": 0.01630975584836153
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072774,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072774
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22448979591836735,
"acc_stderr": 0.02671143055553841,
"acc_norm": 0.22448979591836735,
"acc_norm_stderr": 0.02671143055553841
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553027,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553027
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24969400244798043,
"mc1_stderr": 0.015152286907148128,
"mc2": 0.47837127971295423,
"mc2_stderr": 0.015611380743614477
},
"harness|winogrande|5": {
"acc": 0.5185477505919495,
"acc_stderr": 0.014042813708888378
},
"harness|gsm8k|5": {
"acc": 0.003032600454890068,
"acc_stderr": 0.0015145735612245505
}
}
```
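The per-task entries in a results file like the one above share the `harness|<task>|<n_shots>` key pattern, so ranking tasks by accuracy is a small comprehension over the dict. A sketch over an abbreviated copy of the results above (only a few tasks included here):

```python
# Abbreviated extract of the results JSON shown above.
results = {
    "harness|arc:challenge|25": {"acc": 0.18003412969283278},
    "harness|hellaswag|10": {"acc": 0.2634933280223063},
    "harness|hendrycksTest-high_school_statistics|5": {"acc": 0.4722222222222222},
    "harness|winogrande|5": {"acc": 0.5185477505919495},
}

# Keep only entries that report an accuracy, then sort best-first.
by_acc = sorted(
    ((task, scores["acc"]) for task, scores in results.items() if "acc" in scores),
    key=lambda pair: pair[1],
    reverse=True,
)
best_task, best_acc = by_acc[0]
print(best_task, best_acc)
```

The same pattern works on the full dict loaded from the `results` configuration; tasks that only report `mc1`/`mc2` (like `truthfulqa:mc`) are skipped by the `"acc" in scores` filter.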
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
size_categories: n<1K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for multilabel_ds
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("kursathalat/multilabel_ds")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("kursathalat/multilabel_ds")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, **vectors**, and **guidelines**.
The **fields** are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| text | Text | text | True | False |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| label | Label | multi_label_selection | True | N/A | ['admiration', 'amusement', 'anger', 'annoyance', 'approval', 'caring', 'confusion', 'curiosity', 'desire', 'disappointment', 'disapproval', 'disgust', 'embarrassment', 'excitement', 'fear', 'gratitude', 'grief', 'joy', 'love', 'nervousness', 'optimism', 'pride', 'realization', 'relief', 'remorse', 'sadness', 'surprise', 'neutral'] |
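Because `label` is a multi-label question, each response holds a subset of the 28 labels above. A minimal sketch of turning such a response into a binary indicator vector, as one might do before training a classifier (the label list is abbreviated here; the full list is in the table above):

```python
# Abbreviated label list; the dataset defines 28 labels in total.
LABELS = ["admiration", "amusement", "anger", "annoyance", "approval", "neutral"]

def to_indicator(selected: list[str], labels: list[str] = LABELS) -> list[int]:
    """Encode a multi-label response as a 0/1 vector aligned with `labels`."""
    chosen = set(selected)
    return [1 if label in chosen else 0 for label in labels]

# ["approval", "neutral"] matches the example record in the Data Instances section.
print(to_indicator(["approval", "neutral"]))
```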
The **suggestions** are human- or machine-generated recommendations for each question, intended to assist the annotator during the annotation process. They are always linked to existing questions and are named by appending "-suggestion" and "-suggestion-metadata" to the question name, holding the value(s) of the suggestion and its metadata, respectively. The possible values are therefore the same as in the table above; only the column names carry the "-suggestion" and "-suggestion-metadata" suffixes.
The **metadata** is a dictionary that can be used to provide additional information about the dataset record. This can be useful to give annotators extra context, or to record details about the record itself, such as a link to the original source, the author, or the date. The metadata is always optional, and can be linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
| Metadata Name | Title | Type | Values | Visible for Annotators |
| ------------- | ----- | ---- | ------ | ---------------------- |
The **guidelines** are optional as well: just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
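The suggestion naming convention described above is mechanical, so the column names can be derived directly from a question name. A minimal sketch in plain Python (the helper function is illustrative, not part of the `argilla` library):

```python
# Build the column names that Argilla appends for suggestions, following the
# "-suggestion" / "-suggestion-metadata" convention described above.
def suggestion_columns(question_name: str) -> tuple[str, str]:
    return (f"{question_name}-suggestion",
            f"{question_name}-suggestion-metadata")

# This dataset has a single question named "label".
value_col, meta_col = suggestion_columns("label")
print(value_col)  # label-suggestion
print(meta_col)   # label-suggestion-metadata
```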
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": null,
"fields": {
"text": " \"If you don\u0027t wear BROWN AND ORANGE...YOU DON\u0027T MATTER!\" We need a tshirt with that on it asap! "
},
"metadata": {},
"responses": [
{
"status": "submitted",
"user_id": "525f4274-ebb4-4aee-a116-f8b422b2e2b4",
"values": {
"label": {
"value": [
"approval",
"neutral"
]
}
}
}
],
"suggestions": [
{
"agent": null,
"question_name": "label",
"score": null,
"type": "human",
"value": [
"annoyance",
"neutral"
]
}
],
"vectors": {}
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"external_id": null,
"label": [
{
"status": "submitted",
"user_id": "525f4274-ebb4-4aee-a116-f8b422b2e2b4",
"value": [
"approval",
"neutral"
]
}
],
"label-suggestion": [
"annoyance",
"neutral"
],
"label-suggestion-metadata": {
"agent": null,
"score": null,
"type": "human"
},
"metadata": "{}",
"text": " \"If you don\u0027t wear BROWN AND ORANGE...YOU DON\u0027T MATTER!\" We need a tshirt with that on it asap! "
}
```
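As a minimal sketch (plain Python, no Argilla dependency), the `datasets`-style record above can be unpacked into the text's submitted labels and suggested labels like so; the record literal is the example shown above:

```python
# The `datasets`-format record shown above, as a Python literal.
record = {
    "external_id": None,
    "label": [
        {
            "status": "submitted",
            "user_id": "525f4274-ebb4-4aee-a116-f8b422b2e2b4",
            "value": ["approval", "neutral"],
        }
    ],
    "label-suggestion": ["annoyance", "neutral"],
    "label-suggestion-metadata": {"agent": None, "score": None, "type": "human"},
    "metadata": "{}",
    "text": '"If you don\'t wear BROWN AND ORANGE...YOU DON\'T MATTER!" We need a tshirt with that on it asap!',
}

# Keep only submitted responses; each one carries the annotator's chosen labels.
submitted = [r["value"] for r in record["label"] if r["status"] == "submitted"]
suggested = record["label-suggestion"]

print(submitted)  # [['approval', 'neutral']]
print(suggested)  # ['annoyance', 'neutral']
```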
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves; at the moment, only text fields are supported. These are the ones that will be used to provide responses to the questions.
* **text** is of type `text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **label** is of type `multi_label_selection` with the following allowed values ['admiration', 'amusement', 'anger', 'annoyance', 'approval', 'caring', 'confusion', 'curiosity', 'desire', 'disappointment', 'disapproval', 'disgust', 'embarrassment', 'excitement', 'fear', 'gratitude', 'grief', 'joy', 'love', 'nervousness', 'optimism', 'pride', 'realization', 'relief', 'remorse', 'sadness', 'surprise', 'neutral'].
* **Suggestions:** As of Argilla 1.13.0, the suggestions have been included to provide the annotators with suggestions to ease or assist during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **label-suggestion** is of type `multi_label_selection` with the following allowed values ['admiration', 'amusement', 'anger', 'annoyance', 'approval', 'caring', 'confusion', 'curiosity', 'desire', 'disappointment', 'disapproval', 'disgust', 'embarrassment', 'excitement', 'fear', 'gratitude', 'grief', 'joy', 'love', 'nervousness', 'optimism', 'pride', 'realization', 'relief', 'remorse', 'sadness', 'surprise', 'neutral'].
Additionally, we also have two more fields that are optional and are the following:
* **metadata:** This is an optional field that can be used to provide additional information about the dataset record. This can be useful to give annotators extra context, or to record details about the record itself, such as a link to the original source, the author, or the date. It can be linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
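Since **label** is a multi-label question, a common preprocessing step is turning each response into a multi-hot vector over the 28 allowed labels. A minimal sketch in plain Python (the label list is the one given above; the `multi_hot` helper is illustrative):

```python
# The 28 allowed values of the "label" question, as listed above.
LABELS = ['admiration', 'amusement', 'anger', 'annoyance', 'approval', 'caring',
          'confusion', 'curiosity', 'desire', 'disappointment', 'disapproval',
          'disgust', 'embarrassment', 'excitement', 'fear', 'gratitude', 'grief',
          'joy', 'love', 'nervousness', 'optimism', 'pride', 'realization',
          'relief', 'remorse', 'sadness', 'surprise', 'neutral']

def multi_hot(selected, labels=LABELS):
    """Map a list of selected label names to a 0/1 vector over `labels`."""
    chosen = set(selected)
    return [1 if lab in chosen else 0 for lab in labels]

# The submitted response from the example record above.
vec = multi_hot(["approval", "neutral"])
print(sum(vec))                       # 2
print(vec[LABELS.index("approval")])  # 1
```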
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
This is a text classification dataset that contains texts and labels. Given a set of texts and a predefined set of labels, the goal of text classification is to assign one or more labels to each text based on its content. Please classify the texts by making the correct selection.
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
TheFinAI/flare-sm-acl | ---
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
- name: choices
sequence: string
- name: gold
dtype: int64
splits:
- name: train
num_bytes: 70385369
num_examples: 20781
- name: valid
num_bytes: 9049127
num_examples: 2555
- name: test
num_bytes: 13359338
num_examples: 3720
download_size: 46311736
dataset_size: 92793834
---
# Dataset Card for "flare-sm-acl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/9497_Images_OCR_Data_of_10_Types_of_Forms | ---
license: cc-by-nc-nd-4.0
---
## Description
9,497 Images - OCR Data of 10 Types of Forms. Rectangular bounding boxes were adopted to annotate the forms. The data can be used for tasks such as form detection.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1056?source=Huggingface
## Data size
9,497 images, 10 types of forms
## Collecting environment
pure color background
## Data diversity
multiple types of forms
## Data format
the image data format is .jpg; the annotation file format is .json
## Annotation content
rectangular bounding boxes of forms
## Accuracy
The error bound of each vertex of a rectangular bounding box is within 5 pixels, which qualifies an annotation; the accuracy of the rectangular bounding boxes is not less than 95%.
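The 5-pixel vertex tolerance can be checked mechanically. A hedged sketch in plain Python (the annotation schema is an assumption — the card only says the files are JSON; boxes here are given as (x1, y1, x2, y2) corner coordinates):

```python
def box_within_tolerance(pred, truth, tol=5):
    """True if every vertex of `pred` lies within `tol` pixels of the
    matching vertex of `truth` (boxes given as (x1, y1, x2, y2))."""
    (px1, py1, px2, py2), (tx1, ty1, tx2, ty2) = pred, truth
    pred_corners = [(px1, py1), (px2, py1), (px1, py2), (px2, py2)]
    truth_corners = [(tx1, ty1), (tx2, ty1), (tx1, ty2), (tx2, ty2)]
    return all(abs(pa - ta) <= tol and abs(pb - tb) <= tol
               for (pa, pb), (ta, tb) in zip(pred_corners, truth_corners))

print(box_within_tolerance((10, 10, 110, 60), (12, 8, 108, 63)))   # True
print(box_within_tolerance((10, 10, 110, 60), (20, 10, 110, 60)))  # False
```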
## Licensing Information
Commercial License
|
elricwan/oig | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 47893346745
num_examples: 14430253
download_size: 25698713382
dataset_size: 47893346745
---
# Dataset Card for "oig"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Trofish/Korean-RLHF-Full-process | ---
license: cc-by-nc-4.0
task_categories:
- reinforcement-learning
- text-generation
language:
- ko
tags:
- RLHF
- SFT
- RM
- instruction-tuning
- reward-model
- PPO
---
## This dataset was used for RLHF training with KULLM as the baseline model.
- **Step 1: step1_SFT_train.jsonl** (used for supervised fine-tuning of the KULLM 12.8B model.)
- **Step 2: step2_RM_train.json** (used to train the polyglot-ko 1.3B model as a reward model.)
- **Step 3: step3_PPO_train.json** (used for RLHF training with the SFT and RM models.)
For more details, please see: https://huggingface.co/Trofish/KULLM-RLHF
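A minimal sketch for loading the step files above with the standard library (the file names are those listed above; the per-record field names are an assumption, shown with an in-memory sample instead of the real file):

```python
import io
import json

# step1_SFT_train.jsonl is JSON Lines: one record per line. The "prompt" /
# "completion" field names below are illustrative; inspect the real file
# for the actual schema.
sample_jsonl = io.StringIO(
    '{"prompt": "...", "completion": "..."}\n'
    '{"prompt": "...", "completion": "..."}\n'
)
sft_records = [json.loads(line) for line in sample_jsonl]
print(len(sft_records))  # 2

# step2_RM_train.json and step3_PPO_train.json are plain JSON files and
# would instead be read with json.load(open(path)).
```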
## Dataset construction for each reinforcement learning stage


## Considerations in dataset selection
- **Datasets were selected both to improve everyday-conversation ability and handling of hate speech, and to include general-task data so that the chatbot's performance on general tasks does not degrade during training**
- **National Institute of Korean Language everyday conversation dataset:** natural responses to everyday conversation with correct spelling, free of slang, ungrammatical sentences, and initial-consonant abbreviations, covering a variety of topics
- **AI Hub hate speech dataset:** diverse hate expressions across categories such as hate, discrimination, sexual content, violence, and crime
- **General task datasets**
- Evol-Instruct dataset: complex, logical prompts and answers across diverse domains
- Self-Instruct dataset: data augmented from high-quality human-written seed data
- RLHF Korean translation dataset: the dataset released by DeepSpeedChat, translated into Korean
# Step 1. Fine-tuning the SFT model
## Baseline Model
- [**KULLM**, a Korean LLM developed by the Korea University NLP & AI Lab and the HIAI Research Institute](https://github.com/nlpai-lab/KULLM)
## Datasets

# Step 2. Reward Model ver1 implementation
## Baseline Model
- Used **Polyglot-Ko**, a large-scale Korean language model developed by EleutherAI
- Experimented with both the 1.3B and 5.8B models
## Datasets

- Dataset construction following the InstructGPT recipe
- For reward-model training, used the prompts from SFT training (1,500 prompts; everyday conversation : hate speech = 2:1) together with new prompts (1,000 from the translated DeepSpeedChat dataset)
- Generated K responses per prompt with the SFT model and labeled their ranking
- Dataset labeling
- While InstructGPT used human labeling, GPT-4 with G-Eval was used here for consistent evaluation and to save time
- Of the two responses generated by the SFT model, the one with the higher total G-Eval score was chosen as the chosen response
- The G-Eval evaluation prompt differed by dataset type
- 
## RLHF Model Evaluation


## Final RLHF Model
- https://huggingface.co/Trofish/KULLM-RLHF
|
AIRI-NLP/quality_counter_new_5120 | ---
dataset_info:
features:
- name: context
dtype: string
- name: word
dtype: string
- name: claim
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 558587914
num_examples: 20000
- name: validation
num_bytes: 221539952
num_examples: 8000
- name: test
num_bytes: 56238158
num_examples: 2300
download_size: 26660389
dataset_size: 836366024
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
HydraLM/partitioned_v2_standardized_8 | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
splits:
- name: train
num_bytes: 59591177.03676694
num_examples: 124197
download_size: 10388424
dataset_size: 59591177.03676694
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "partitioned_v2_standardized_8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-sasha__dog-food-8a6c4abe-13775897 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- sasha/dog-food
eval_info:
task: image_binary_classification
model: abhishek/autotrain-dog-vs-food
metrics: ['matthews_correlation']
dataset_name: sasha/dog-food
dataset_config: sasha--dog-food
dataset_split: train
col_mapping:
image: image
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Binary Image Classification
* Model: abhishek/autotrain-dog-vs-food
* Dataset: sasha/dog-food
* Config: sasha--dog-food
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@ahmetgunduz](https://huggingface.co/ahmetgunduz) for evaluating this model. |
heliosprime/twitter_dataset_1713043343 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 13383
num_examples: 31
download_size: 10269
dataset_size: 13383
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713043343"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdrianaCasadei/CornGrain | ---
license: other
license_name: corngrain
license_link: LICENSE
---
|
Mainred/model | ---
license: unknown
---
|
sakulosa/wario-ita | ---
license: openrail
---
|
nhantruongcse/summary-vietnamese-news-token-TFeval_vit5_large | ---
dataset_info:
features:
- name: Content
dtype: string
- name: Summary
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 61526294
num_examples: 8229
download_size: 27275716
dataset_size: 61526294
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-cnn_dailymail-3.0.0-af1ac4-44796145150 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: Einmalumdiewelt/T5-Base_GNAD
metrics: []
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: Einmalumdiewelt/T5-Base_GNAD
* Dataset: cnn_dailymail
* Config: 3.0.0
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@TotallyIntended](https://huggingface.co/TotallyIntended) for evaluating this model. |
cungnlp/sample_datacung | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2612
num_examples: 10
download_size: 3434
dataset_size: 2612
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
umanlp/babelbert-dataset | ---
license: other
---
This dataset was created and used for the paper titled [Massively Multilingual Lexical Specialization of Multilingual Transformers](https://aclanthology.org/2023.acl-long.426/)
This dataset is a processed version of BabelNet v5.0 downloaded from https://babelnet.org, made available with the BabelNet Non-Commercial License (see https://babelnet.org/full-license). |
MariaIsabel/PROMISE_NFR_translated | ---
annotations_creators:
- other
language:
- es
language_creators:
- other
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: PROMISE NFR translated to Spanish
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- multi-class-classification
---
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
Published version of the PROMISE NFR dataset translated into Spanish, used for the paper 'Requirements Classification Using FastText and BETO in Spanish Documents'.
### Languages
Spanish
## Dataset Structure
### Data Fields
- **Project:** the project's identifier.
- **Requirement:** description of the software requirement.
- **Label:** label of the requirement: F (functional requirement) or NF (non-functional requirement).
## Dataset Creation
### Initial Data Collection and Normalization
PROMISE [http://promise.site.uottawa.ca/SERepository/] is a repository used in most requirements classification research. It contains 84 datasets; the one used for requirements classification is NFR, with 625 requirement sentences, 255 identified as functional and 370 as non-functional. We translated it into Spanish.
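With the fields described above, splitting records by label is straightforward. A hedged sketch in plain Python (the rows are illustrative stand-ins for actual dataset rows):

```python
# Illustrative rows with the documented fields: Project, Requirement, Label.
rows = [
    {"Project": "1", "Requirement": "El sistema debe autenticar al usuario.", "Label": "F"},
    {"Project": "1", "Requirement": "El sistema debe responder en menos de 2 segundos.", "Label": "NF"},
]

functional = [r for r in rows if r["Label"] == "F"]
non_functional = [r for r in rows if r["Label"] == "NF"]
print(len(functional), len(non_functional))  # 1 1
```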
## Additional Information
### Citation Information
https://doi.org/10.1007/978-3-031-29786-1_11
|
liupf/ChEBI-20-MM | ---
license: mit
task_categories:
- text-generation
- translation
tags:
- chemistry
- biology
---
# ChEBI-20-MM Dataset
## Overview
The ChEBI-20-MM is an extensive and multi-modal benchmark developed from the ChEBI-20 dataset. It is designed to provide a comprehensive benchmark for evaluating various models' capabilities in the field of molecular science. This benchmark integrates multi-modal data, including InChI, IUPAC, SELFIES, and images, making it a versatile tool for a wide range of molecular tasks.
## Dataset Description
ChEBI-20-MM is an expansion of the original ChEBI-20 dataset, with a focus on incorporating diverse modalities of molecular data. This benchmark is tailored to assess models in several key areas:
- **Molecule Generation**: Evaluating the ability of models to generate accurate molecular structures.
- **Image Recognition**: Testing models on their proficiency in converting molecular images into other representational formats.
- **IUPAC Recognition**: Evaluating the ability of models to generate IUPAC names from other representational formats.
- **Molecular Captioning**: Assessing the capability of models to generate descriptive captions for molecular structures.
- **Retrieval Tasks**: Measuring the effectiveness of models in retrieving molecular information accurately and efficiently.
## Utility and Significance
By expanding the data modality variety, this benchmark enables a more comprehensive evaluation of models' performance in multi-modal data handling.
## How to Use
Model reviews and evaluations related to this dataset can be directly accessed and used via the SLM4Mol link: [SLM4Mol](https://github.com/AI-HPC-Research-Team/SLM4Mol).
## Data Visualization
We employ visualization techniques to analyze the **suitability** of data sources for language models and **chemical space coverage**. The figure below illustrates our use of different visualization methods to analyze text length distributions and token counts generated by each model's tokenizer across various text data types. This approach evaluates the adaptability of language models to the textual characteristics of our dataset.

We also focus on the top 10 scaffolds within the dataset, counting the number of molecules for each scaffold; semi-transparent bars represent the total count, while solid bars indicate the quantity in the training set. For the analysis of **chemical space coverage**, we choose molecular weight (MW), LogP, the number of aromatic rings, and the topological polar surface area (TPSA) as descriptors. We examine the distribution and correlation of these descriptors within the dataset, providing insight into the chemical diversity and complexity present in our data.
## Acknowledgments
The development of the ChEBI-20-MM dataset was inspired by the ChEBI-20 in molecule generation and captioning initiated by MolT5. Additional data information supplements are derived from PubChem.
|
fireinwind/GPT4_Instinwild_belle_cn_school_math_zdmqa | ---
license: apache-2.0
---
This dataset combines several public datasets from the web, including alpaca_gpt4_data_zh, instinwild_ch, belle_cn_school_math, and a selection of curated QA content from zdm. The goal is to strengthen creativity, diversity, and support for data, logic, and math during basic SFT fine-tuning. The zdm QA portion was added to improve performance on product selection and the "worth buying" (值得买) domain.
|
bleroni/kosovo_institution_names | ---
license: mit
---
List of Kosovo's local and international institutions |
AMead10/Arten | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 3737340287.0
num_examples: 673
download_size: 3619281514
dataset_size: 3737340287.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
legacy107/sentence_transformer_wikipedia_chunked | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answer_start
dtype: int64
- name: answer
dtype: string
- name: article
dtype: string
- name: chunked_article
sequence: string
splits:
- name: train
num_bytes: 3734770114
num_examples: 27742
- name: test
num_bytes: 408448904
num_examples: 3468
- name: validation
num_bytes: 564192755
num_examples: 3458
download_size: 717817867
dataset_size: 4707411773
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
# Dataset Card for "qa_wikipedia_sentence_transformer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
the-glendalorian/prompts-responses | ---
license: mit
---
|
AlekseyKorshuk/evaluation_dataset | ---
dataset_info:
features:
- name: input_text
dtype: string
- name: output_text
dtype: string
- name: user_id
dtype: string
- name: tokens
sequence: int64
splits:
- name: validation
num_bytes: 116428164
num_examples: 54881
- name: train
num_bytes: 465740765
num_examples: 219524
download_size: 320290001
dataset_size: 582168929
---
# Dataset Card for "evaluation_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibranze/araproje_hellaswag_tr_conf_mgpt_farestscore_false | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 162703.0
num_examples: 250
download_size: 87035
dataset_size: 162703.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_tr_conf_mgpt_farestscore_false"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_180 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 965468660.0
num_examples: 189605
download_size: 986503346
dataset_size: 965468660.0
---
# Dataset Card for "chunk_180"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gonnascore/structuredlogs | ---
license: gpl-3.0
---
|
open-llm-leaderboard/details_weezywitasneezy__OxytocinErosEngineeringF1-7B-slerp | ---
pretty_name: Evaluation run of weezywitasneezy/OxytocinErosEngineeringF1-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [weezywitasneezy/OxytocinErosEngineeringF1-7B-slerp](https://huggingface.co/weezywitasneezy/OxytocinErosEngineeringF1-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_weezywitasneezy__OxytocinErosEngineeringF1-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-27T17:45:19.383720](https://huggingface.co/datasets/open-llm-leaderboard/details_weezywitasneezy__OxytocinErosEngineeringF1-7B-slerp/blob/main/results_2024-03-27T17-45-19.383720.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6495394786201328,\n\
\ \"acc_stderr\": 0.032064181282482354,\n \"acc_norm\": 0.6513332123060737,\n\
\ \"acc_norm_stderr\": 0.03271057829798055,\n \"mc1\": 0.3708690330477356,\n\
\ \"mc1_stderr\": 0.016909693580248825,\n \"mc2\": 0.5453513648146335,\n\
\ \"mc2_stderr\": 0.0150981854258586\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6339590443686007,\n \"acc_stderr\": 0.01407722310847014,\n\
\ \"acc_norm\": 0.6715017064846417,\n \"acc_norm_stderr\": 0.013724978465537302\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.661521609241187,\n\
\ \"acc_stderr\": 0.004722250355106684,\n \"acc_norm\": 0.8599880501892053,\n\
\ \"acc_norm_stderr\": 0.003462902601136198\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6085106382978723,\n \"acc_stderr\": 0.03190701242326812,\n\
\ \"acc_norm\": 0.6085106382978723,\n \"acc_norm_stderr\": 0.03190701242326812\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474884,\n \"\
acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.025525034382474884\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606648,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606648\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009245,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009245\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640766,\n \"\
acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640766\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.013547415658662257,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.013547415658662257\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.02386800326250011,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.02386800326250011\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.016384638410380823,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.016384638410380823\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958147,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958147\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7363344051446945,\n\
\ \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.7363344051446945,\n\
\ \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4556714471968709,\n\
\ \"acc_stderr\": 0.012719949543032199,\n \"acc_norm\": 0.4556714471968709,\n\
\ \"acc_norm_stderr\": 0.012719949543032199\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174927,\n\
\ \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174927\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482707,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3708690330477356,\n\
\ \"mc1_stderr\": 0.016909693580248825,\n \"mc2\": 0.5453513648146335,\n\
\ \"mc2_stderr\": 0.0150981854258586\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8113654301499605,\n \"acc_stderr\": 0.010995172318019813\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6178923426838514,\n \
\ \"acc_stderr\": 0.013384173935648492\n }\n}\n```"
repo_url: https://huggingface.co/weezywitasneezy/OxytocinErosEngineeringF1-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|arc:challenge|25_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|gsm8k|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hellaswag|10_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T17-45-19.383720.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T17-45-19.383720.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- '**/details_harness|winogrande|5_2024-03-27T17-45-19.383720.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-27T17-45-19.383720.parquet'
- config_name: results
data_files:
- split: 2024_03_27T17_45_19.383720
path:
- results_2024-03-27T17-45-19.383720.parquet
- split: latest
path:
- results_2024-03-27T17-45-19.383720.parquet
---
# Dataset Card for Evaluation run of weezywitasneezy/OxytocinErosEngineeringF1-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [weezywitasneezy/OxytocinErosEngineeringF1-7B-slerp](https://huggingface.co/weezywitasneezy/OxytocinErosEngineeringF1-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_weezywitasneezy__OxytocinErosEngineeringF1-7B-slerp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-27T17:45:19.383720](https://huggingface.co/datasets/open-llm-leaderboard/details_weezywitasneezy__OxytocinErosEngineeringF1-7B-slerp/blob/main/results_2024-03-27T17-45-19.383720.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6495394786201328,
"acc_stderr": 0.032064181282482354,
"acc_norm": 0.6513332123060737,
"acc_norm_stderr": 0.03271057829798055,
"mc1": 0.3708690330477356,
"mc1_stderr": 0.016909693580248825,
"mc2": 0.5453513648146335,
"mc2_stderr": 0.0150981854258586
},
"harness|arc:challenge|25": {
"acc": 0.6339590443686007,
"acc_stderr": 0.01407722310847014,
"acc_norm": 0.6715017064846417,
"acc_norm_stderr": 0.013724978465537302
},
"harness|hellaswag|10": {
"acc": 0.661521609241187,
"acc_stderr": 0.004722250355106684,
"acc_norm": 0.8599880501892053,
"acc_norm_stderr": 0.003462902601136198
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6085106382978723,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.6085106382978723,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.025525034382474884,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.025525034382474884
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606648,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606648
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886786,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009245,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009245
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640766,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640766
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467617,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.013547415658662257,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.013547415658662257
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.02386800326250011,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.02386800326250011
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4,
"acc_stderr": 0.016384638410380823,
"acc_norm": 0.4,
"acc_norm_stderr": 0.016384638410380823
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958147,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958147
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7363344051446945,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.7363344051446945,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4556714471968709,
"acc_stderr": 0.012719949543032199,
"acc_norm": 0.4556714471968709,
"acc_norm_stderr": 0.012719949543032199
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507208,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507208
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.027529637440174927,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.027529637440174927
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482707,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3708690330477356,
"mc1_stderr": 0.016909693580248825,
"mc2": 0.5453513648146335,
"mc2_stderr": 0.0150981854258586
},
"harness|winogrande|5": {
"acc": 0.8113654301499605,
"acc_stderr": 0.010995172318019813
},
"harness|gsm8k|5": {
"acc": 0.6178923426838514,
"acc_stderr": 0.013384173935648492
}
}
```
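The per-task scores above are plain floats keyed by the harness task name, so they can be compared directly once loaded. A minimal sketch (values copied from a few of the results above, not fetched from the Hub):

```python
# A small subset of the per-task accuracies from the "Latest results" block above.
scores = {
    "harness|arc:challenge|25": 0.6340,
    "harness|hellaswag|10": 0.6615,
    "harness|winogrande|5": 0.8114,
    "harness|gsm8k|5": 0.6179,
}

# Rank tasks by accuracy to spot the model's strongest benchmark.
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
best_task, best_acc = ranked[0]
```

The same pattern applies to the full JSON downloaded from the "results" configuration: each `harness|…` key maps to a dict of metrics (`acc`, `acc_norm`, `mc1`, …).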
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ndavidson/cisco_inam_chatml | ---
dataset_info:
features:
- name: Question
dtype: string
- name: Answer
dtype: string
- name: Text
dtype: string
- name: prompt_and_answer
dtype: string
splits:
- name: train
num_bytes: 66351087
num_examples: 33170
download_size: 18365975
dataset_size: 66351087
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
edbeeching/prj_gia_dataset_atari_2B_atari_jamesbond_1111 | ---
library_name: gia
tags:
- deep-reinforcement-learning
- reinforcement-learning
- gia
- multi-task
- multi-modal
- imitation-learning
- offline-reinforcement-learning
---
An imitation learning environment for the atari_jamesbond environment, a sample for the policy atari_2B_atari_jamesbond_1111.
This environment was created as part of the Generally Intelligent Agents (GIA) project: https://github.com/huggingface/gia
|
Norod78/bender-blip2-captions-512 | ---
language: en
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 64037080.0
num_examples: 260
download_size: 63813431
dataset_size: 64037080.0
---
# Dataset Card for "bender-blip2-captions-512"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
annahii/anahiteste | ---
license: openrail
---
|
KyS/OCR_VnReceipt | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 57394505.5
num_examples: 6228
download_size: 57083510
dataset_size: 57394505.5
---
# Dataset Card for "OCR_VnReceipt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexusflow/Function_Call_Definitions | ---
license: cc-by-nc-sa-4.0
dataset_info:
- config_name: CVECPE
features:
- name: function_calls
dtype: string
- name: descriptions
dtype: string
splits:
- name: train
num_bytes: 8237
num_examples: 2
download_size: 13384
dataset_size: 8237
- config_name: CVECPE_Multi (Nested)
features:
- name: function_calls
dtype: string
- name: descriptions
dtype: string
splits:
- name: train
num_bytes: 17425
num_examples: 20
download_size: 15503
dataset_size: 17425
- config_name: Climate
features:
- name: function_calls
dtype: string
- name: descriptions
dtype: string
splits:
- name: train
num_bytes: 2905
num_examples: 8
download_size: 4163
dataset_size: 2905
- config_name: OTX
features:
- name: function_calls
dtype: string
- name: descriptions
dtype: string
splits:
- name: train
num_bytes: 7040
num_examples: 9
download_size: 8407
dataset_size: 7040
- config_name: Places
features:
- name: function_calls
dtype: string
- name: descriptions
dtype: string
splits:
- name: train
num_bytes: 2460
num_examples: 7
download_size: 5759
dataset_size: 2460
- config_name: VT_Multi (Nested)
features:
- name: function_calls
dtype: string
- name: descriptions
dtype: string
splits:
- name: train
num_bytes: 18137
num_examples: 29
download_size: 13810
dataset_size: 18137
- config_name: VT_Multi (Parallel)
features:
- name: function_calls
dtype: string
- name: descriptions
dtype: string
splits:
- name: train
num_bytes: 18137
num_examples: 29
download_size: 13810
dataset_size: 18137
- config_name: VirusTotal
features:
- name: function_calls
dtype: string
- name: descriptions
dtype: string
splits:
- name: train
num_bytes: 11501
num_examples: 12
download_size: 11668
dataset_size: 11501
configs:
- config_name: CVECPE
data_files:
- split: train
path: CVECPE/train-*
- config_name: CVECPE_Multi (Nested)
data_files:
- split: train
path: CVECPE_Multi (Nested)/train-*
- config_name: Climate
data_files:
- split: train
path: Climate/train-*
- config_name: OTX
data_files:
- split: train
path: OTX/train-*
- config_name: Places
data_files:
- split: train
path: Places/train-*
- config_name: VT_Multi (Nested)
data_files:
- split: train
path: VT_Multi (Nested)/train-*
- config_name: VT_Multi (Parallel)
data_files:
- split: train
path: VT_Multi (Parallel)/train-*
- config_name: VirusTotal
data_files:
- split: train
path: VirusTotal/train-*
---
# Dataset Card for "Nexusflow/Function_Call_Definitions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/bab683f5 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1342
dataset_size: 182
---
# Dataset Card for "bab683f5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Birchlabs/openai-prm800k-phase2_test-stepwise-best | ---
license: mit
---
|
ms_marco | ---
language:
- en
paperswithcode_id: ms-marco
pretty_name: Microsoft Machine Reading Comprehension Dataset
dataset_info:
- config_name: v1.1
features:
- name: answers
sequence: string
- name: passages
sequence:
- name: is_selected
dtype: int32
- name: passage_text
dtype: string
- name: url
dtype: string
- name: query
dtype: string
- name: query_id
dtype: int32
- name: query_type
dtype: string
- name: wellFormedAnswers
sequence: string
splits:
- name: validation
num_bytes: 42665198
num_examples: 10047
- name: train
num_bytes: 350516260
num_examples: 82326
- name: test
num_bytes: 40977580
num_examples: 9650
download_size: 217328153
dataset_size: 434159038
- config_name: v2.1
features:
- name: answers
sequence: string
- name: passages
sequence:
- name: is_selected
dtype: int32
- name: passage_text
dtype: string
- name: url
dtype: string
- name: query
dtype: string
- name: query_id
dtype: int32
- name: query_type
dtype: string
- name: wellFormedAnswers
sequence: string
splits:
- name: validation
num_bytes: 413765365
num_examples: 101093
- name: train
num_bytes: 3462807709
num_examples: 808731
- name: test
num_bytes: 405691932
num_examples: 101092
download_size: 2105722550
dataset_size: 4282265006
configs:
- config_name: v1.1
data_files:
- split: validation
path: v1.1/validation-*
- split: train
path: v1.1/train-*
- split: test
path: v1.1/test-*
- config_name: v2.1
data_files:
- split: validation
path: v2.1/validation-*
- split: train
path: v2.1/train-*
- split: test
path: v2.1/test-*
---
# Dataset Card for "ms_marco"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://microsoft.github.io/msmarco/](https://microsoft.github.io/msmarco/)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 1.55 GB
- **Size of the generated dataset:** 4.72 GB
- **Total amount of disk used:** 6.28 GB
### Dataset Summary
Starting with a paper released at NIPS 2016, MS MARCO is a collection of datasets focused on deep learning in search.
The first dataset was a question answering dataset featuring 100,000 real Bing questions and a human generated answer.
Since then we released a 1,000,000 question dataset, a natural langauge generation dataset, a passage ranking dataset,
keyphrase extraction dataset, crawling dataset, and a conversational search.
There have been 277 submissions: 20 KeyPhrase Extraction submissions, 87 passage ranking submissions, 0 document ranking
submissions, 73 QnA V2 submissions, 82 NLGEN submissions, and 15 QnA V1 submissions.
This data comes in three tasks/forms: Original QnA dataset(v1.1), Question Answering(v2.1), Natural Language Generation(v2.1).
The original question answering dataset featured 100,000 examples and was released in 2016. The leaderboard is now closed but the data is available below.
The current competitive tasks are Question Answering and Natural Language Generation. Question Answering features over 1,000,000 queries and
is much like the original QnA dataset but bigger and with higher quality. The Natural Language Generation dataset features 180,000 examples and
builds upon the QnA dataset to deliver answers that could be spoken by a smart speaker.
version v1.1
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### v1.1
- **Size of downloaded dataset files:** 168.69 MB
- **Size of the generated dataset:** 434.61 MB
- **Total amount of disk used:** 603.31 MB
An example of 'train' looks as follows.
```
```
#### v2.1
- **Size of downloaded dataset files:** 1.38 GB
- **Size of the generated dataset:** 4.29 GB
- **Total amount of disk used:** 5.67 GB
An example of 'validation' looks as follows.
```
```
### Data Fields
The data fields are the same among all splits.
#### v1.1
- `answers`: a `list` of `string` features.
- `passages`: a dictionary feature containing:
  - `is_selected`: an `int32` feature.
- `passage_text`: a `string` feature.
- `url`: a `string` feature.
- `query`: a `string` feature.
- `query_id`: an `int32` feature.
- `query_type`: a `string` feature.
- `wellFormedAnswers`: a `list` of `string` features.
#### v2.1
- `answers`: a `list` of `string` features.
- `passages`: a dictionary feature containing:
  - `is_selected`: an `int32` feature.
- `passage_text`: a `string` feature.
- `url`: a `string` feature.
- `query`: a `string` feature.
- `query_id`: an `int32` feature.
- `query_type`: a `string` feature.
- `wellFormedAnswers`: a `list` of `string` features.
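The nested `passages` feature is a sequence feature, so in a loaded example it appears as a dict of parallel lists: index `i` of each list describes the i-th passage. A minimal sketch of working with one record shaped like the schema above (the values here are invented for illustration, not taken from the dataset):

```python
# Hypothetical record matching the v1.1/v2.1 field layout (values invented for illustration).
record = {
    "query": "what is the capital of france",
    "query_id": 1234,
    "query_type": "location",
    "answers": ["Paris"],
    "passages": {
        "is_selected": [0, 1],
        "passage_text": ["Paris is a city in France.", "The capital of France is Paris."],
        "url": ["https://example.org/a", "https://example.org/b"],
    },
    "wellFormedAnswers": [],
}

# Zip the parallel lists to recover the passages the annotators marked as selected.
selected = [
    text
    for text, flag in zip(record["passages"]["passage_text"],
                          record["passages"]["is_selected"])
    if flag == 1
]
```

The same indexing works on real examples returned by `load_dataset("ms_marco", "v1.1")`.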
### Data Splits
|name|train |validation| test |
|----|-----:|---------:|-----:|
|v1.1| 82326| 10047| 9650|
|v2.1|808731| 101093|101092|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@article{DBLP:journals/corr/NguyenRSGTMD16,
author = {Tri Nguyen and
Mir Rosenberg and
Xia Song and
Jianfeng Gao and
Saurabh Tiwary and
Rangan Majumder and
Li Deng},
title = {{MS} {MARCO:} {A} Human Generated MAchine Reading COmprehension Dataset},
journal = {CoRR},
volume = {abs/1611.09268},
year = {2016},
url = {http://arxiv.org/abs/1611.09268},
archivePrefix = {arXiv},
eprint = {1611.09268},
timestamp = {Mon, 13 Aug 2018 16:49:03 +0200},
biburl = {https://dblp.org/rec/journals/corr/NguyenRSGTMD16.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
### Contributions
Thanks to [@mariamabarham](https://github.com/mariamabarham), [@thomwolf](https://github.com/thomwolf), [@lewtun](https://github.com/lewtun) for adding this dataset.
---
pretty_name: Evaluation run of csujeong/Gemma-7B-Finetuning-JCS-Ko-Ins
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [csujeong/Gemma-7B-Finetuning-JCS-Ko-Ins](https://huggingface.co/csujeong/Gemma-7B-Finetuning-JCS-Ko-Ins)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_csujeong__Gemma-7B-Finetuning-JCS-Ko-Ins\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-04T05:26:10.457342](https://huggingface.co/datasets/open-llm-leaderboard/details_csujeong__Gemma-7B-Finetuning-JCS-Ko-Ins/blob/main/results_2024-03-04T05-26-10.457342.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6602436583411211,\n\
\ \"acc_stderr\": 0.03182322431719899,\n \"acc_norm\": 0.6644956996858645,\n\
\ \"acc_norm_stderr\": 0.0324528094596806,\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.016419874731135025,\n \"mc2\": 0.48601382240433894,\n\
\ \"mc2_stderr\": 0.014597538953181043\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5844709897610921,\n \"acc_stderr\": 0.014401366641216379,\n\
\ \"acc_norm\": 0.6245733788395904,\n \"acc_norm_stderr\": 0.014150631435111726\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.627365066719777,\n\
\ \"acc_stderr\": 0.004825179407757565,\n \"acc_norm\": 0.8278231428002389,\n\
\ \"acc_norm_stderr\": 0.0037676251416117012\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n\
\ \"acc_stderr\": 0.042992689054808644,\n \"acc_norm\": 0.5481481481481482,\n\
\ \"acc_norm_stderr\": 0.042992689054808644\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.037150621549989056,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.037150621549989056\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n\
\ \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6994219653179191,\n\
\ \"acc_stderr\": 0.03496101481191179,\n \"acc_norm\": 0.6994219653179191,\n\
\ \"acc_norm_stderr\": 0.03496101481191179\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6382978723404256,\n \"acc_stderr\": 0.0314108219759624,\n\
\ \"acc_norm\": 0.6382978723404256,\n \"acc_norm_stderr\": 0.0314108219759624\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6689655172413793,\n \"acc_stderr\": 0.03921545312467122,\n\
\ \"acc_norm\": 0.6689655172413793,\n \"acc_norm_stderr\": 0.03921545312467122\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5026455026455027,\n \"acc_stderr\": 0.02575094967813039,\n \"\
acc_norm\": 0.5026455026455027,\n \"acc_norm_stderr\": 0.02575094967813039\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.022755204959542946,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.022755204959542946\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.03123475237772117,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.03123475237772117\n },\n\
\ \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8484848484848485,\n\
\ \"acc_stderr\": 0.025545650426603627,\n \"acc_norm\": 0.8484848484848485,\n\
\ \"acc_norm_stderr\": 0.025545650426603627\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\"\
: {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3925925925925926,\n \"acc_stderr\": 0.029773847012532967,\n \
\ \"acc_norm\": 0.3925925925925926,\n \"acc_norm_stderr\": 0.029773847012532967\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.423841059602649,\n \"acc_stderr\": 0.04034846678603397,\n \"acc_norm\"\
: 0.423841059602649,\n \"acc_norm_stderr\": 0.04034846678603397\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n\
\ \"acc_stderr\": 0.01540508439315707,\n \"acc_norm\": 0.8477064220183487,\n\
\ \"acc_norm_stderr\": 0.01540508439315707\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.033016908987210894,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.033016908987210894\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8396624472573839,\n \"acc_stderr\": 0.023884380925965662,\n \
\ \"acc_norm\": 0.8396624472573839,\n \"acc_norm_stderr\": 0.023884380925965662\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7309417040358744,\n\
\ \"acc_stderr\": 0.02976377940687497,\n \"acc_norm\": 0.7309417040358744,\n\
\ \"acc_norm_stderr\": 0.02976377940687497\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"\
acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5535714285714286,\n\
\ \"acc_stderr\": 0.04718471485219587,\n \"acc_norm\": 0.5535714285714286,\n\
\ \"acc_norm_stderr\": 0.04718471485219587\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8365261813537676,\n\
\ \"acc_stderr\": 0.013223928616741609,\n \"acc_norm\": 0.8365261813537676,\n\
\ \"acc_norm_stderr\": 0.013223928616741609\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7023121387283237,\n \"acc_stderr\": 0.024617055388676992,\n\
\ \"acc_norm\": 0.7023121387283237,\n \"acc_norm_stderr\": 0.024617055388676992\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39106145251396646,\n\
\ \"acc_stderr\": 0.016320763763808383,\n \"acc_norm\": 0.39106145251396646,\n\
\ \"acc_norm_stderr\": 0.016320763763808383\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340866,\n\
\ \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340866\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.025403832978179608,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.025403832978179608\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5177304964539007,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.5177304964539007,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4810951760104302,\n\
\ \"acc_stderr\": 0.012761104871472655,\n \"acc_norm\": 0.4810951760104302,\n\
\ \"acc_norm_stderr\": 0.012761104871472655\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.029289413409403196,\n\
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.029289413409403196\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6911764705882353,\n \"acc_stderr\": 0.018690850273595284,\n \
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.018690850273595284\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n\
\ \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.7454545454545455,\n\
\ \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.02753912288906145,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.02753912288906145\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.016419874731135025,\n \"mc2\": 0.48601382240433894,\n\
\ \"mc2_stderr\": 0.014597538953181043\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7908445146014207,\n \"acc_stderr\": 0.011430450045881583\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5193328278999242,\n \
\ \"acc_stderr\": 0.013762185709851353\n }\n}\n```"
repo_url: https://huggingface.co/csujeong/Gemma-7B-Finetuning-JCS-Ko-Ins
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|arc:challenge|25_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|gsm8k|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hellaswag|10_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T05-26-10.457342.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-04T05-26-10.457342.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- '**/details_harness|winogrande|5_2024-03-04T05-26-10.457342.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-04T05-26-10.457342.parquet'
- config_name: results
data_files:
- split: 2024_03_04T05_26_10.457342
path:
- results_2024-03-04T05-26-10.457342.parquet
- split: latest
path:
- results_2024-03-04T05-26-10.457342.parquet
---
# Dataset Card for Evaluation run of csujeong/Gemma-7B-Finetuning-JCS-Ko-Ins
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [csujeong/Gemma-7B-Finetuning-JCS-Ko-Ins](https://huggingface.co/csujeong/Gemma-7B-Finetuning-JCS-Ko-Ins) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_csujeong__Gemma-7B-Finetuning-JCS-Ko-Ins",
"harness_winogrande_5",
split="train")
```
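Each per-task configuration name mirrors its harness task key: for example, `harness|hendrycksTest-college_biology|5` maps to the config `harness_hendrycksTest_college_biology_5`. A small helper to derive a config name from a task key — this is a sketch of the naming pattern visible in this card's config list, not an official API:

```python
def to_config_name(task_key: str) -> str:
    """Turn a harness task key into the dataset config name.

    The pattern observed in this card replaces the '|', '-' and ':'
    separators with underscores.
    """
    for sep in ("|", "-", ":"):
        task_key = task_key.replace(sep, "_")
    return task_key

# Examples matching the configs listed in the metadata above:
print(to_config_name("harness|hendrycksTest-college_biology|5"))
# harness_hendrycksTest_college_biology_5
print(to_config_name("harness|truthfulqa:mc|0"))
# harness_truthfulqa_mc_0
```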
## Latest results
These are the [latest results from run 2024-03-04T05:26:10.457342](https://huggingface.co/datasets/open-llm-leaderboard/details_csujeong__Gemma-7B-Finetuning-JCS-Ko-Ins/blob/main/results_2024-03-04T05-26-10.457342.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6602436583411211,
"acc_stderr": 0.03182322431719899,
"acc_norm": 0.6644956996858645,
"acc_norm_stderr": 0.0324528094596806,
"mc1": 0.3268053855569155,
"mc1_stderr": 0.016419874731135025,
"mc2": 0.48601382240433894,
"mc2_stderr": 0.014597538953181043
},
"harness|arc:challenge|25": {
"acc": 0.5844709897610921,
"acc_stderr": 0.014401366641216379,
"acc_norm": 0.6245733788395904,
"acc_norm_stderr": 0.014150631435111726
},
"harness|hellaswag|10": {
"acc": 0.627365066719777,
"acc_stderr": 0.004825179407757565,
"acc_norm": 0.8278231428002389,
"acc_norm_stderr": 0.0037676251416117012
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5481481481481482,
"acc_stderr": 0.042992689054808644,
"acc_norm": 0.5481481481481482,
"acc_norm_stderr": 0.042992689054808644
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.037150621549989056,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.037150621549989056
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.03496101481191179,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.03496101481191179
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6382978723404256,
"acc_stderr": 0.0314108219759624,
"acc_norm": 0.6382978723404256,
"acc_norm_stderr": 0.0314108219759624
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6689655172413793,
"acc_stderr": 0.03921545312467122,
"acc_norm": 0.6689655172413793,
"acc_norm_stderr": 0.03921545312467122
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5026455026455027,
"acc_stderr": 0.02575094967813039,
"acc_norm": 0.5026455026455027,
"acc_norm_stderr": 0.02575094967813039
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8,
"acc_stderr": 0.022755204959542946,
"acc_norm": 0.8,
"acc_norm_stderr": 0.022755204959542946
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.03123475237772117,
"acc_norm": 0.8,
"acc_norm_stderr": 0.03123475237772117
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.025545650426603627,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.025545650426603627
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3925925925925926,
"acc_stderr": 0.029773847012532967,
"acc_norm": 0.3925925925925926,
"acc_norm_stderr": 0.029773847012532967
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.423841059602649,
"acc_stderr": 0.04034846678603397,
"acc_norm": 0.423841059602649,
"acc_norm_stderr": 0.04034846678603397
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.01540508439315707,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.01540508439315707
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.625,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.625,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078962,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078962
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8396624472573839,
"acc_stderr": 0.023884380925965662,
"acc_norm": 0.8396624472573839,
"acc_norm_stderr": 0.023884380925965662
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7309417040358744,
"acc_stderr": 0.02976377940687497,
"acc_norm": 0.7309417040358744,
"acc_norm_stderr": 0.02976377940687497
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.03390780612972776,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.03390780612972776
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5535714285714286,
"acc_stderr": 0.04718471485219587,
"acc_norm": 0.5535714285714286,
"acc_norm_stderr": 0.04718471485219587
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8365261813537676,
"acc_stderr": 0.013223928616741609,
"acc_norm": 0.8365261813537676,
"acc_norm_stderr": 0.013223928616741609
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7023121387283237,
"acc_stderr": 0.024617055388676992,
"acc_norm": 0.7023121387283237,
"acc_norm_stderr": 0.024617055388676992
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39106145251396646,
"acc_stderr": 0.016320763763808383,
"acc_norm": 0.39106145251396646,
"acc_norm_stderr": 0.016320763763808383
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340866,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340866
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179608,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179608
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5177304964539007,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.5177304964539007,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4810951760104302,
"acc_stderr": 0.012761104871472655,
"acc_norm": 0.4810951760104302,
"acc_norm_stderr": 0.012761104871472655
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.029289413409403196,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.029289413409403196
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.018690850273595284,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.018690850273595284
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291293,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291293
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.02753912288906145,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.02753912288906145
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3268053855569155,
"mc1_stderr": 0.016419874731135025,
"mc2": 0.48601382240433894,
"mc2_stderr": 0.014597538953181043
},
"harness|winogrande|5": {
"acc": 0.7908445146014207,
"acc_stderr": 0.011430450045881583
},
"harness|gsm8k|5": {
"acc": 0.5193328278999242,
"acc_stderr": 0.013762185709851353
}
}
```
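The per-task `hendrycksTest` entries in the JSON above can be macro-averaged to recover a single MMLU-style score. A minimal sketch over a small excerpt of that dict — the three entries hardcoded below are copied from the output above for illustration; iterating over the full `results` config would work the same way:

```python
# Excerpt of the results dict shown above (truncated to three tasks for brevity).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5481481481481482},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7039473684210527},
}

# Macro-average: each task weighs equally, regardless of its question count.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest")]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU macro-average over {len(mmlu_accs)} tasks: {mmlu_avg:.4f}")
```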
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
HaiboinLeeds/eee3_small | ---
license: mit
---
|
FabioArdi/yugioh_images | ---
dataset_info:
features:
- name: image
dtype: image
- name: name
dtype: string
- name: frameType
dtype: string
splits:
- name: train
num_bytes: 660338115.545
num_examples: 12405
download_size: 656146541
dataset_size: 660338115.545
---
# Dataset Card for "yugioh_images"
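From the `dataset_info` block above, a rough back-of-the-envelope estimate of the average on-disk size per example can be derived (this figure is computed from the metadata, not stated in the original card):

```python
# Figures taken from the dataset_info metadata above.
num_bytes = 660_338_115.545
num_examples = 12_405

avg_bytes = num_bytes / num_examples
print(f"~{avg_bytes / 1024:.1f} KiB per example")  # → ~52.0 KiB per example
```

At roughly 52 KiB per card image, the 12,405 examples account for the ~660 MB dataset size reported above.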
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseek-67b-v15-base | ---
pretty_name: Evaluation run of OpenBuddy/openbuddy-deepseek-67b-v15-base
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenBuddy/openbuddy-deepseek-67b-v15-base](https://huggingface.co/OpenBuddy/openbuddy-deepseek-67b-v15-base)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseek-67b-v15-base\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-10T00:18:57.450795](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseek-67b-v15-base/blob/main/results_2023-12-10T00-18-57.450795.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7077078977933982,\n\
\ \"acc_stderr\": 0.030015760444243065,\n \"acc_norm\": 0.7114942838020437,\n\
\ \"acc_norm_stderr\": 0.030600759357365358,\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.016874805001453178,\n \"mc2\": 0.5230963516759597,\n\
\ \"mc2_stderr\": 0.014845955802002899\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6356655290102389,\n \"acc_stderr\": 0.014063260279882419,\n\
\ \"acc_norm\": 0.6629692832764505,\n \"acc_norm_stderr\": 0.013813476652902276\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6751643098984266,\n\
\ \"acc_stderr\": 0.004673563250946104,\n \"acc_norm\": 0.8602867954590719,\n\
\ \"acc_norm_stderr\": 0.003459806991389836\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8157894736842105,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.8157894736842105,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.78,\n\
\ \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7962264150943397,\n \"acc_stderr\": 0.024790784501775406,\n\
\ \"acc_norm\": 0.7962264150943397,\n \"acc_norm_stderr\": 0.024790784501775406\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8541666666666666,\n\
\ \"acc_stderr\": 0.029514245964291762,\n \"acc_norm\": 0.8541666666666666,\n\
\ \"acc_norm_stderr\": 0.029514245964291762\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n\
\ \"acc_stderr\": 0.035149425512674394,\n \"acc_norm\": 0.6936416184971098,\n\
\ \"acc_norm_stderr\": 0.035149425512674394\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7021276595744681,\n \"acc_stderr\": 0.02989614568209546,\n\
\ \"acc_norm\": 0.7021276595744681,\n \"acc_norm_stderr\": 0.02989614568209546\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5350877192982456,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.5350877192982456,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6896551724137931,\n \"acc_stderr\": 0.03855289616378949,\n\
\ \"acc_norm\": 0.6896551724137931,\n \"acc_norm_stderr\": 0.03855289616378949\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5317460317460317,\n \"acc_stderr\": 0.025699352832131792,\n \"\
acc_norm\": 0.5317460317460317,\n \"acc_norm_stderr\": 0.025699352832131792\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8290322580645161,\n \"acc_stderr\": 0.02141724293632159,\n \"\
acc_norm\": 0.8290322580645161,\n \"acc_norm_stderr\": 0.02141724293632159\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5665024630541872,\n \"acc_stderr\": 0.034867317274198714,\n \"\
acc_norm\": 0.5665024630541872,\n \"acc_norm_stderr\": 0.034867317274198714\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n\
\ \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9090909090909091,\n \"acc_stderr\": 0.020482086775424208,\n \"\
acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.020482086775424208\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9481865284974094,\n \"acc_stderr\": 0.01599622932024412,\n\
\ \"acc_norm\": 0.9481865284974094,\n \"acc_norm_stderr\": 0.01599622932024412\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7076923076923077,\n \"acc_stderr\": 0.02306043838085774,\n \
\ \"acc_norm\": 0.7076923076923077,\n \"acc_norm_stderr\": 0.02306043838085774\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4,\n \"acc_stderr\": 0.029869605095316908,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.029869605095316908\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.819327731092437,\n \"acc_stderr\": 0.02499196496660076,\n\
\ \"acc_norm\": 0.819327731092437,\n \"acc_norm_stderr\": 0.02499196496660076\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.41721854304635764,\n \"acc_stderr\": 0.04026141497634612,\n \"\
acc_norm\": 0.41721854304635764,\n \"acc_norm_stderr\": 0.04026141497634612\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9009174311926605,\n \"acc_stderr\": 0.012809780081878927,\n \"\
acc_norm\": 0.9009174311926605,\n \"acc_norm_stderr\": 0.012809780081878927\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5925925925925926,\n \"acc_stderr\": 0.033509916046960436,\n \"\
acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.033509916046960436\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316942,\n \"\
acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316942\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8945147679324894,\n \"acc_stderr\": 0.01999556072375854,\n \
\ \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.01999556072375854\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8071748878923767,\n\
\ \"acc_stderr\": 0.026478240960489365,\n \"acc_norm\": 0.8071748878923767,\n\
\ \"acc_norm_stderr\": 0.026478240960489365\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.03217829420744631,\n\
\ \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.03217829420744631\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8429752066115702,\n \"acc_stderr\": 0.03321244842547129,\n \"\
acc_norm\": 0.8429752066115702,\n \"acc_norm_stderr\": 0.03321244842547129\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8220858895705522,\n \"acc_stderr\": 0.03004735765580662,\n\
\ \"acc_norm\": 0.8220858895705522,\n \"acc_norm_stderr\": 0.03004735765580662\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5535714285714286,\n\
\ \"acc_stderr\": 0.04718471485219587,\n \"acc_norm\": 0.5535714285714286,\n\
\ \"acc_norm_stderr\": 0.04718471485219587\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n\
\ \"acc_stderr\": 0.01789378490401853,\n \"acc_norm\": 0.9188034188034188,\n\
\ \"acc_norm_stderr\": 0.01789378490401853\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8991060025542784,\n\
\ \"acc_stderr\": 0.010770472014886722,\n \"acc_norm\": 0.8991060025542784,\n\
\ \"acc_norm_stderr\": 0.010770472014886722\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7803468208092486,\n \"acc_stderr\": 0.02228963885261789,\n\
\ \"acc_norm\": 0.7803468208092486,\n \"acc_norm_stderr\": 0.02228963885261789\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4145251396648045,\n\
\ \"acc_stderr\": 0.016476342210253996,\n \"acc_norm\": 0.4145251396648045,\n\
\ \"acc_norm_stderr\": 0.016476342210253996\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7908496732026143,\n \"acc_stderr\": 0.023287685312334806,\n\
\ \"acc_norm\": 0.7908496732026143,\n \"acc_norm_stderr\": 0.023287685312334806\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7845659163987139,\n\
\ \"acc_stderr\": 0.023350225475471442,\n \"acc_norm\": 0.7845659163987139,\n\
\ \"acc_norm_stderr\": 0.023350225475471442\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8549382716049383,\n \"acc_stderr\": 0.019594877019727956,\n\
\ \"acc_norm\": 0.8549382716049383,\n \"acc_norm_stderr\": 0.019594877019727956\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5177304964539007,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.5177304964539007,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5632333767926988,\n\
\ \"acc_stderr\": 0.012667701919603657,\n \"acc_norm\": 0.5632333767926988,\n\
\ \"acc_norm_stderr\": 0.012667701919603657\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7463235294117647,\n \"acc_stderr\": 0.026431329870789527,\n\
\ \"acc_norm\": 0.7463235294117647,\n \"acc_norm_stderr\": 0.026431329870789527\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7794117647058824,\n \"acc_stderr\": 0.016774672365468504,\n \
\ \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.016774672365468504\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7673469387755102,\n \"acc_stderr\": 0.02704925791589618,\n\
\ \"acc_norm\": 0.7673469387755102,\n \"acc_norm_stderr\": 0.02704925791589618\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n\
\ \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n\
\ \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.93,\n \"acc_stderr\": 0.025643239997624294,\n \
\ \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.025643239997624294\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.02567934272327692,\n\
\ \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.02567934272327692\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.016874805001453178,\n \"mc2\": 0.5230963516759597,\n\
\ \"mc2_stderr\": 0.014845955802002899\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8358326756116812,\n \"acc_stderr\": 0.010410849775222782\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5686125852918877,\n \
\ \"acc_stderr\": 0.013642195352511575\n }\n}\n```"
repo_url: https://huggingface.co/OpenBuddy/openbuddy-deepseek-67b-v15-base
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|arc:challenge|25_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|gsm8k|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hellaswag|10_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T00-18-57.450795.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T00-18-57.450795.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- '**/details_harness|winogrande|5_2023-12-10T00-18-57.450795.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-10T00-18-57.450795.parquet'
- config_name: results
data_files:
- split: 2023_12_10T00_18_57.450795
path:
- results_2023-12-10T00-18-57.450795.parquet
- split: latest
path:
- results_2023-12-10T00-18-57.450795.parquet
---
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-deepseek-67b-v15-base
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenBuddy/openbuddy-deepseek-67b-v15-base
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-deepseek-67b-v15-base](https://huggingface.co/OpenBuddy/openbuddy-deepseek-67b-v15-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseek-67b-v15-base",
"harness_winogrande_5",
split="train")
```
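Each run's split name is simply the run timestamp with the `-` (date) and `:` (time) separators replaced by underscores. A minimal sketch of that convention (the helper name is illustrative, not part of any API):

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp (as used in results filenames) to its split name.

    Sketch of the naming convention used in this dataset: '-' in the date
    and ':' in the time become '_'; the fractional seconds are kept verbatim.
    """
    date, time = timestamp.split("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")

print(run_timestamp_to_split("2023-12-10T00:18:57.450795"))
# 2023_12_10T00_18_57.450795
```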
## Latest results
These are the [latest results from run 2023-12-10T00:18:57.450795](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseek-67b-v15-base/blob/main/results_2023-12-10T00-18-57.450795.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7077078977933982,
"acc_stderr": 0.030015760444243065,
"acc_norm": 0.7114942838020437,
"acc_norm_stderr": 0.030600759357365358,
"mc1": 0.3671970624235006,
"mc1_stderr": 0.016874805001453178,
"mc2": 0.5230963516759597,
"mc2_stderr": 0.014845955802002899
},
"harness|arc:challenge|25": {
"acc": 0.6356655290102389,
"acc_stderr": 0.014063260279882419,
"acc_norm": 0.6629692832764505,
"acc_norm_stderr": 0.013813476652902276
},
"harness|hellaswag|10": {
"acc": 0.6751643098984266,
"acc_stderr": 0.004673563250946104,
"acc_norm": 0.8602867954590719,
"acc_norm_stderr": 0.003459806991389836
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8157894736842105,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.8157894736842105,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7962264150943397,
"acc_stderr": 0.024790784501775406,
"acc_norm": 0.7962264150943397,
"acc_norm_stderr": 0.024790784501775406
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8541666666666666,
"acc_stderr": 0.029514245964291762,
"acc_norm": 0.8541666666666666,
"acc_norm_stderr": 0.029514245964291762
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.035149425512674394,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.035149425512674394
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7021276595744681,
"acc_stderr": 0.02989614568209546,
"acc_norm": 0.7021276595744681,
"acc_norm_stderr": 0.02989614568209546
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5350877192982456,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.5350877192982456,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6896551724137931,
"acc_stderr": 0.03855289616378949,
"acc_norm": 0.6896551724137931,
"acc_norm_stderr": 0.03855289616378949
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5317460317460317,
"acc_stderr": 0.025699352832131792,
"acc_norm": 0.5317460317460317,
"acc_norm_stderr": 0.025699352832131792
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8290322580645161,
"acc_stderr": 0.02141724293632159,
"acc_norm": 0.8290322580645161,
"acc_norm_stderr": 0.02141724293632159
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5665024630541872,
"acc_stderr": 0.034867317274198714,
"acc_norm": 0.5665024630541872,
"acc_norm_stderr": 0.034867317274198714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.020482086775424208,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.020482086775424208
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9481865284974094,
"acc_stderr": 0.01599622932024412,
"acc_norm": 0.9481865284974094,
"acc_norm_stderr": 0.01599622932024412
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7076923076923077,
"acc_stderr": 0.02306043838085774,
"acc_norm": 0.7076923076923077,
"acc_norm_stderr": 0.02306043838085774
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.029869605095316908,
"acc_norm": 0.4,
"acc_norm_stderr": 0.029869605095316908
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.819327731092437,
"acc_stderr": 0.02499196496660076,
"acc_norm": 0.819327731092437,
"acc_norm_stderr": 0.02499196496660076
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.41721854304635764,
"acc_stderr": 0.04026141497634612,
"acc_norm": 0.41721854304635764,
"acc_norm_stderr": 0.04026141497634612
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9009174311926605,
"acc_stderr": 0.012809780081878927,
"acc_norm": 0.9009174311926605,
"acc_norm_stderr": 0.012809780081878927
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.033509916046960436,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.033509916046960436
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316942,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316942
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8945147679324894,
"acc_stderr": 0.01999556072375854,
"acc_norm": 0.8945147679324894,
"acc_norm_stderr": 0.01999556072375854
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8071748878923767,
"acc_stderr": 0.026478240960489365,
"acc_norm": 0.8071748878923767,
"acc_norm_stderr": 0.026478240960489365
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.03217829420744631,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.03217829420744631
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8429752066115702,
"acc_stderr": 0.03321244842547129,
"acc_norm": 0.8429752066115702,
"acc_norm_stderr": 0.03321244842547129
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8220858895705522,
"acc_stderr": 0.03004735765580662,
"acc_norm": 0.8220858895705522,
"acc_norm_stderr": 0.03004735765580662
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5535714285714286,
"acc_stderr": 0.04718471485219587,
"acc_norm": 0.5535714285714286,
"acc_norm_stderr": 0.04718471485219587
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9188034188034188,
"acc_stderr": 0.01789378490401853,
"acc_norm": 0.9188034188034188,
"acc_norm_stderr": 0.01789378490401853
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8991060025542784,
"acc_stderr": 0.010770472014886722,
"acc_norm": 0.8991060025542784,
"acc_norm_stderr": 0.010770472014886722
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7803468208092486,
"acc_stderr": 0.02228963885261789,
"acc_norm": 0.7803468208092486,
"acc_norm_stderr": 0.02228963885261789
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4145251396648045,
"acc_stderr": 0.016476342210253996,
"acc_norm": 0.4145251396648045,
"acc_norm_stderr": 0.016476342210253996
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7908496732026143,
"acc_stderr": 0.023287685312334806,
"acc_norm": 0.7908496732026143,
"acc_norm_stderr": 0.023287685312334806
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7845659163987139,
"acc_stderr": 0.023350225475471442,
"acc_norm": 0.7845659163987139,
"acc_norm_stderr": 0.023350225475471442
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8549382716049383,
"acc_stderr": 0.019594877019727956,
"acc_norm": 0.8549382716049383,
"acc_norm_stderr": 0.019594877019727956
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5177304964539007,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.5177304964539007,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5632333767926988,
"acc_stderr": 0.012667701919603657,
"acc_norm": 0.5632333767926988,
"acc_norm_stderr": 0.012667701919603657
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7463235294117647,
"acc_stderr": 0.026431329870789527,
"acc_norm": 0.7463235294117647,
"acc_norm_stderr": 0.026431329870789527
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.016774672365468504,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.016774672365468504
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7673469387755102,
"acc_stderr": 0.02704925791589618,
"acc_norm": 0.7673469387755102,
"acc_norm_stderr": 0.02704925791589618
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.025643239997624294,
"acc_norm": 0.93,
"acc_norm_stderr": 0.025643239997624294
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.02567934272327692,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.02567934272327692
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3671970624235006,
"mc1_stderr": 0.016874805001453178,
"mc2": 0.5230963516759597,
"mc2_stderr": 0.014845955802002899
},
"harness|winogrande|5": {
"acc": 0.8358326756116812,
"acc_stderr": 0.010410849775222782
},
"harness|gsm8k|5": {
"acc": 0.5686125852918877,
"acc_stderr": 0.013642195352511575
}
}
```
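Each task key in the JSON above (e.g. `harness|hendrycksTest-virology|5`) corresponds to a configuration of this dataset; the mapping appears to replace the `|`, `-`, and `:` characters with underscores. A hypothetical helper sketching that mapping (the function name is illustrative):

```python
def task_to_config_name(task: str) -> str:
    # Sketch of the apparent convention: every '|', '-', and ':' in the
    # harness task key becomes '_' in the dataset configuration name.
    return task.replace("|", "_").replace("-", "_").replace(":", "_")

print(task_to_config_name("harness|hendrycksTest-virology|5"))
# harness_hendrycksTest_virology_5
print(task_to_config_name("harness|truthfulqa:mc|0"))
# harness_truthfulqa_mc_0
```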
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CVasNLPExperiments/VQAv2_sample_validation_google_flan_t5_xl_mode_T_A_D_PNP_GENERIC_C_Q_rices_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large__
num_bytes: 140791
num_examples: 1000
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large_clean_
num_bytes: 140621
num_examples: 1000
download_size: 105484
dataset_size: 281412
---
# Dataset Card for "VQAv2_sample_validation_google_flan_t5_xl_mode_T_A_D_PNP_GENERIC_C_Q_rices_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
marmofayezi/M3GenLandmarkMuse | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: caption
dtype: string
- name: generated_mask
dtype: image
- name: generated_image
dtype: image
splits:
- name: train
num_bytes: 1874182665.75
num_examples: 2998
download_size: 1873715073
dataset_size: 1874182665.75
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Kaluniano12/RYKOJI.2 | ---
license: openrail
---
|
docxster/invoices-v3 | ---
dataset_info:
features:
- name: id
dtype: string
- name: words
sequence: string
- name: bboxes
sequence:
sequence: float64
- name: ner_tags
sequence: int64
- name: image_path
dtype: string
splits:
- name: train
num_bytes: 21054455
num_examples: 2792
- name: test
num_bytes: 5325385
num_examples: 698
download_size: 16368518
dataset_size: 26379840
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Ediudo/STEVEM | ---
license: openrail
---
|
heliosprime/twitter_dataset_1712982986 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 9002
num_examples: 19
download_size: 8969
dataset_size: 9002
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712982986"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Ba2han__Tinypus-1.5B | ---
pretty_name: Evaluation run of Ba2han/Tinypus-1.5B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Ba2han/Tinypus-1.5B](https://huggingface.co/Ba2han/Tinypus-1.5B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Ba2han__Tinypus-1.5B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-06T00:12:30.006711](https://huggingface.co/datasets/open-llm-leaderboard/details_Ba2han__Tinypus-1.5B/blob/main/results_2024-01-06T00-12-30.006711.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26030422262398545,\n\
\ \"acc_stderr\": 0.03102662924569781,\n \"acc_norm\": 0.2620638339247892,\n\
\ \"acc_norm_stderr\": 0.031802324390183184,\n \"mc1\": 0.24724602203182375,\n\
\ \"mc1_stderr\": 0.015102404797359652,\n \"mc2\": 0.39352829284262486,\n\
\ \"mc2_stderr\": 0.013949834959566018\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3054607508532423,\n \"acc_stderr\": 0.013460080478002501,\n\
\ \"acc_norm\": 0.33447098976109213,\n \"acc_norm_stderr\": 0.013787460322441374\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.43248356901015733,\n\
\ \"acc_stderr\": 0.004944080605048774,\n \"acc_norm\": 0.5734913363871739,\n\
\ \"acc_norm_stderr\": 0.004935587729948866\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2,\n \
\ \"acc_stderr\": 0.03455473702325438,\n \"acc_norm\": 0.2,\n \"\
acc_norm_stderr\": 0.03455473702325438\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n\
\ \"acc_stderr\": 0.03126511206173043,\n \"acc_norm\": 0.2138728323699422,\n\
\ \"acc_norm_stderr\": 0.03126511206173043\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.03057944277361034,\n\
\ \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.03057944277361034\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.21379310344827587,\n \"acc_stderr\": 0.03416520447747548,\n\
\ \"acc_norm\": 0.21379310344827587,\n \"acc_norm_stderr\": 0.03416520447747548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2751322751322751,\n \"acc_stderr\": 0.023000086859068635,\n \"\
acc_norm\": 0.2751322751322751,\n \"acc_norm_stderr\": 0.023000086859068635\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.03932537680392869,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.03932537680392869\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24838709677419354,\n\
\ \"acc_stderr\": 0.02458002892148101,\n \"acc_norm\": 0.24838709677419354,\n\
\ \"acc_norm_stderr\": 0.02458002892148101\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.21182266009852216,\n \"acc_stderr\": 0.02874898368994106,\n\
\ \"acc_norm\": 0.21182266009852216,\n \"acc_norm_stderr\": 0.02874898368994106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21212121212121213,\n \"acc_stderr\": 0.02912652283458682,\n \"\
acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.02912652283458682\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.23834196891191708,\n \"acc_stderr\": 0.03074890536390988,\n\
\ \"acc_norm\": 0.23834196891191708,\n \"acc_norm_stderr\": 0.03074890536390988\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2358974358974359,\n \"acc_stderr\": 0.02152596540740872,\n \
\ \"acc_norm\": 0.2358974358974359,\n \"acc_norm_stderr\": 0.02152596540740872\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.226890756302521,\n \"acc_stderr\": 0.027205371538279472,\n \
\ \"acc_norm\": 0.226890756302521,\n \"acc_norm_stderr\": 0.027205371538279472\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.22935779816513763,\n\
\ \"acc_stderr\": 0.018025349724618684,\n \"acc_norm\": 0.22935779816513763,\n\
\ \"acc_norm_stderr\": 0.018025349724618684\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.031415546294025445,\n\
\ \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.031415546294025445\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.03019028245350194,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.03019028245350194\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2489451476793249,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.2489451476793249,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3632286995515695,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.3632286995515695,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.0372767357559692,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.0372767357559692\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.28703703703703703,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2822085889570552,\n \"acc_stderr\": 0.03536117886664743,\n\
\ \"acc_norm\": 0.2822085889570552,\n \"acc_norm_stderr\": 0.03536117886664743\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578728,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578728\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.21359223300970873,\n \"acc_stderr\": 0.04058042015646035,\n\
\ \"acc_norm\": 0.21359223300970873,\n \"acc_norm_stderr\": 0.04058042015646035\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.28205128205128205,\n\
\ \"acc_stderr\": 0.029480360549541194,\n \"acc_norm\": 0.28205128205128205,\n\
\ \"acc_norm_stderr\": 0.029480360549541194\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26309067688378035,\n\
\ \"acc_stderr\": 0.015745497169049043,\n \"acc_norm\": 0.26309067688378035,\n\
\ \"acc_norm_stderr\": 0.015745497169049043\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2398843930635838,\n \"acc_stderr\": 0.02298959254312357,\n\
\ \"acc_norm\": 0.2398843930635838,\n \"acc_norm_stderr\": 0.02298959254312357\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n\
\ \"acc_stderr\": 0.014400296429225619,\n \"acc_norm\": 0.24581005586592178,\n\
\ \"acc_norm_stderr\": 0.014400296429225619\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.025553169991826517,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.025553169991826517\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2765273311897106,\n\
\ \"acc_stderr\": 0.02540383297817962,\n \"acc_norm\": 0.2765273311897106,\n\
\ \"acc_norm_stderr\": 0.02540383297817962\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25308641975308643,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.25308641975308643,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.22695035460992907,\n \"acc_stderr\": 0.024987106365642976,\n \
\ \"acc_norm\": 0.22695035460992907,\n \"acc_norm_stderr\": 0.024987106365642976\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24185136897001303,\n\
\ \"acc_stderr\": 0.010936550813827061,\n \"acc_norm\": 0.24185136897001303,\n\
\ \"acc_norm_stderr\": 0.010936550813827061\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.25735294117647056,\n \"acc_stderr\": 0.026556519470041513,\n\
\ \"acc_norm\": 0.25735294117647056,\n \"acc_norm_stderr\": 0.026556519470041513\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.272875816993464,\n \"acc_stderr\": 0.01802047414839358,\n \
\ \"acc_norm\": 0.272875816993464,\n \"acc_norm_stderr\": 0.01802047414839358\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.32727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252089,\n \"acc_norm\": 0.32727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252089\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.17142857142857143,\n \"acc_stderr\": 0.02412746346265014,\n\
\ \"acc_norm\": 0.17142857142857143,\n \"acc_norm_stderr\": 0.02412746346265014\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\
\ \"acc_stderr\": 0.02992941540834839,\n \"acc_norm\": 0.23383084577114427,\n\
\ \"acc_norm_stderr\": 0.02992941540834839\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3072289156626506,\n\
\ \"acc_stderr\": 0.035915667978246635,\n \"acc_norm\": 0.3072289156626506,\n\
\ \"acc_norm_stderr\": 0.035915667978246635\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.23391812865497075,\n \"acc_stderr\": 0.03246721765117825,\n\
\ \"acc_norm\": 0.23391812865497075,\n \"acc_norm_stderr\": 0.03246721765117825\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24724602203182375,\n\
\ \"mc1_stderr\": 0.015102404797359652,\n \"mc2\": 0.39352829284262486,\n\
\ \"mc2_stderr\": 0.013949834959566018\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5769534333070244,\n \"acc_stderr\": 0.013885055359056472\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009855951478392721,\n \
\ \"acc_stderr\": 0.0027210765770416625\n }\n}\n```"
repo_url: https://huggingface.co/Ba2han/Tinypus-1.5B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|arc:challenge|25_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|gsm8k|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hellaswag|10_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T00-12-30.006711.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T00-12-30.006711.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- '**/details_harness|winogrande|5_2024-01-06T00-12-30.006711.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-06T00-12-30.006711.parquet'
- config_name: results
data_files:
- split: 2024_01_06T00_12_30.006711
path:
- results_2024-01-06T00-12-30.006711.parquet
- split: latest
path:
- results_2024-01-06T00-12-30.006711.parquet
---
# Dataset Card for Evaluation run of Ba2han/Tinypus-1.5B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Ba2han/Tinypus-1.5B](https://huggingface.co/Ba2han/Tinypus-1.5B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Ba2han__Tinypus-1.5B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-06T00:12:30.006711](https://huggingface.co/datasets/open-llm-leaderboard/details_Ba2han__Tinypus-1.5B/blob/main/results_2024-01-06T00-12-30.006711.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" and "latest" splits for each eval):
```python
{
"all": {
"acc": 0.26030422262398545,
"acc_stderr": 0.03102662924569781,
"acc_norm": 0.2620638339247892,
"acc_norm_stderr": 0.031802324390183184,
"mc1": 0.24724602203182375,
"mc1_stderr": 0.015102404797359652,
"mc2": 0.39352829284262486,
"mc2_stderr": 0.013949834959566018
},
"harness|arc:challenge|25": {
"acc": 0.3054607508532423,
"acc_stderr": 0.013460080478002501,
"acc_norm": 0.33447098976109213,
"acc_norm_stderr": 0.013787460322441374
},
"harness|hellaswag|10": {
"acc": 0.43248356901015733,
"acc_stderr": 0.004944080605048774,
"acc_norm": 0.5734913363871739,
"acc_norm_stderr": 0.004935587729948866
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2,
"acc_stderr": 0.03455473702325438,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03455473702325438
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.03126511206173043,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.03126511206173043
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.32340425531914896,
"acc_stderr": 0.03057944277361034,
"acc_norm": 0.32340425531914896,
"acc_norm_stderr": 0.03057944277361034
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.21379310344827587,
"acc_stderr": 0.03416520447747548,
"acc_norm": 0.21379310344827587,
"acc_norm_stderr": 0.03416520447747548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2751322751322751,
"acc_stderr": 0.023000086859068635,
"acc_norm": 0.2751322751322751,
"acc_norm_stderr": 0.023000086859068635
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.03932537680392869,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.03932537680392869
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.02458002892148101,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.02458002892148101
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21182266009852216,
"acc_stderr": 0.02874898368994106,
"acc_norm": 0.21182266009852216,
"acc_norm_stderr": 0.02874898368994106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.02912652283458682,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.02912652283458682
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23834196891191708,
"acc_stderr": 0.03074890536390988,
"acc_norm": 0.23834196891191708,
"acc_norm_stderr": 0.03074890536390988
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2358974358974359,
"acc_stderr": 0.02152596540740872,
"acc_norm": 0.2358974358974359,
"acc_norm_stderr": 0.02152596540740872
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.226890756302521,
"acc_stderr": 0.027205371538279472,
"acc_norm": 0.226890756302521,
"acc_norm_stderr": 0.027205371538279472
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22935779816513763,
"acc_stderr": 0.018025349724618684,
"acc_norm": 0.22935779816513763,
"acc_norm_stderr": 0.018025349724618684
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.031415546294025445,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.031415546294025445
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.03019028245350194,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.03019028245350194
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2489451476793249,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.2489451476793249,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3632286995515695,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.3632286995515695,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.0372767357559692,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.0372767357559692
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2822085889570552,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.2822085889570552,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578728,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578728
},
"harness|hendrycksTest-management|5": {
"acc": 0.21359223300970873,
"acc_stderr": 0.04058042015646035,
"acc_norm": 0.21359223300970873,
"acc_norm_stderr": 0.04058042015646035
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.28205128205128205,
"acc_stderr": 0.029480360549541194,
"acc_norm": 0.28205128205128205,
"acc_norm_stderr": 0.029480360549541194
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26309067688378035,
"acc_stderr": 0.015745497169049043,
"acc_norm": 0.26309067688378035,
"acc_norm_stderr": 0.015745497169049043
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2398843930635838,
"acc_stderr": 0.02298959254312357,
"acc_norm": 0.2398843930635838,
"acc_norm_stderr": 0.02298959254312357
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24581005586592178,
"acc_stderr": 0.014400296429225619,
"acc_norm": 0.24581005586592178,
"acc_norm_stderr": 0.014400296429225619
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.025553169991826517,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.025553169991826517
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2765273311897106,
"acc_stderr": 0.02540383297817962,
"acc_norm": 0.2765273311897106,
"acc_norm_stderr": 0.02540383297817962
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25308641975308643,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.25308641975308643,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.22695035460992907,
"acc_stderr": 0.024987106365642976,
"acc_norm": 0.22695035460992907,
"acc_norm_stderr": 0.024987106365642976
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24185136897001303,
"acc_stderr": 0.010936550813827061,
"acc_norm": 0.24185136897001303,
"acc_norm_stderr": 0.010936550813827061
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.25735294117647056,
"acc_stderr": 0.026556519470041513,
"acc_norm": 0.25735294117647056,
"acc_norm_stderr": 0.026556519470041513
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.272875816993464,
"acc_stderr": 0.01802047414839358,
"acc_norm": 0.272875816993464,
"acc_norm_stderr": 0.01802047414839358
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.32727272727272727,
"acc_stderr": 0.04494290866252089,
"acc_norm": 0.32727272727272727,
"acc_norm_stderr": 0.04494290866252089
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17142857142857143,
"acc_stderr": 0.02412746346265014,
"acc_norm": 0.17142857142857143,
"acc_norm_stderr": 0.02412746346265014
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.02992941540834839,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.02992941540834839
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3072289156626506,
"acc_stderr": 0.035915667978246635,
"acc_norm": 0.3072289156626506,
"acc_norm_stderr": 0.035915667978246635
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23391812865497075,
"acc_stderr": 0.03246721765117825,
"acc_norm": 0.23391812865497075,
"acc_norm_stderr": 0.03246721765117825
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24724602203182375,
"mc1_stderr": 0.015102404797359652,
"mc2": 0.39352829284262486,
"mc2_stderr": 0.013949834959566018
},
"harness|winogrande|5": {
"acc": 0.5769534333070244,
"acc_stderr": 0.013885055359056472
},
"harness|gsm8k|5": {
"acc": 0.009855951478392721,
"acc_stderr": 0.0027210765770416625
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ndavidson/QA_with_context | ---
dataset_info:
features:
- name: question
dtype: string
- name: context
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 876334767
num_examples: 376066
download_size: 450174406
dataset_size: 876334767
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
TenzinGayche/AlpacaCleanedGPT4 | ---
license: apache-2.0
---
https://github.com/gururise/AlpacaDataCleaned |
hgarg/opengoody | ---
license: apache-2.0
---
|
Arris/ritvik-iyer-state-of-the-union-faiss | ---
license: mit
language:
- en
tags:
- faiss
- sotu
size_categories:
- n<1K
--- |
sirCamp/news_as2_pairs_and_triplets | ---
dataset_info:
features:
- name: texts
sequence: string
splits:
- name: pairs
num_bytes: 25762157
num_examples: 127663
- name: triplets
num_bytes: 82621587
num_examples: 254649
download_size: 46633612
dataset_size: 108383744
---
# Dataset Card for "news_as2_pairs_and_triplets"
This dataset is a modified version of NewsAS2 from [lucadiliello/news_as2](https://huggingface.co/datasets/lucadiliello/news_as2) designed to train/pre-train "passage-retrieval" models.
The dataset is divided into two splits:
+ *pairs* split: Each instance in this split consists of a <u>question-(positive)answer pair</u>.
+ *triplets* split: Each instance in this split consists of a <u>question-(positive)answer-(hard-negative)answer triplet</u>.
For more information, see the original NewsQA dataset in this [repository](https://github.com/lucadiliello/answer-selection) |
irds/tripclick_train_tail | ---
pretty_name: '`tripclick/train/tail`'
viewer: false
source_datasets: ['irds/tripclick']
task_categories:
- text-retrieval
---
# Dataset Card for `tripclick/train/tail`
The `tripclick/train/tail` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/tripclick#tripclick/train/tail).
# Data
This dataset provides:
- `queries` (i.e., topics); count=576,156
- `qrels`: (relevance assessments); count=1,621,493
- For `docs`, use [`irds/tripclick`](https://huggingface.co/datasets/irds/tripclick)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/tripclick_train_tail', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/tripclick_train_tail', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{Rekabsaz2021TripClick,
title={TripClick: The Log Files of a Large Health Web Search Engine},
author={Navid Rekabsaz and Oleg Lesota and Markus Schedl and Jon Brassey and Carsten Eickhoff},
year={2021},
booktitle={SIGIR}
}
```
|
carlosdanielhernandezmena/prueba | ---
license: cc-by-4.0
---
|
khelton/celeb-identities | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Jackie_Robinson
'1': Karen_Gillan
'2': Ken_Griffey_Jr
'3': Michelle_Yeoh
'4': Mike_Trout
'5': Ralph_Macchio
'6': Steve_Lemme
splits:
- name: train
num_bytes: 7874324.0
num_examples: 27
download_size: 7877307
dataset_size: 7874324.0
---
# Dataset Card for "celeb-identities"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yjhuang01/Hokchia | ---
language:
- zh
license:
- mit
multilinguality:
- monolingual
size_categories:
- n<1K
source_datasets:
- original
task_categories:
- automatic-speech-recognition
---
# Hokchia Audio Dataset
Hokchia, or the Fuqing dialect, is a branch of Eastern Min Chinese spoken mainly in the Fuqing City of Fujian province, China. Unlike Hokkien, which is more widely recognized and spoken in various parts of Southeast Asia, Hokchia maintains its unique linguistic characteristics and is primarily used within the Fuqing community and its diaspora. This dialect is known for its distinct pronunciation, vocabulary, and grammatical structures compared to other Min Chinese varieties. The Hokchia audio dataset is designed to foster speech recognition technologies that cater to this unique dialect, aiming to enhance digital inclusivity for Hokchia speakers worldwide.
## Dataset Description
The Hokchia Audio Dataset is a collection of audio recordings in the Hokchia language, accompanied by transcriptions. It is intended for use in speech recognition models, particularly to fine-tune models like Whisper for the Hokchia language. The dataset includes a wide range of spoken content, making it suitable for various applications requiring speech-to-text capabilities in Hokchia.
## Content
Each audio file in the dataset is named following the pattern `Hokchia_X.wav`, where `X` is a numerical identifier. Accompanying each audio file is a JSON line in the `whisper_finetune_input.jsonl` file, providing the text transcription of the audio content. The dataset structure is as follows:
- `README.md`: This file.
- `dataset/`: Directory containing audio files split into subdirectories by language.
- `Hokchia/`: Subdirectory containing Hokchia audio files.
- `whisper_finetune_input.jsonl`: JSON Lines file containing mappings of audio file paths to their text transcriptions.
## Dataset Structure
Each line in the `whisper_finetune_input.jsonl` file represents a data point in the following format:
```json
{"audio_filepath": "./dataset/Hokchia/Hokchia_1.wav", "text": "text transcription here"}
```
- `audio_filepath`: Relative path to the audio file.
- `text`: Transcription of the audio in Hokchia.
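If you want to preprocess the manifest outside of the 🤗 `datasets` library, each line can be parsed directly with the standard `json` module. A minimal sketch (the helper name is our own, not part of the dataset):

```python
import json

def load_manifest(path):
    """Read a JSON Lines manifest into a list of {"audio_filepath", "text"} dicts."""
    records = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:  # skip blank lines
                records.append(json.loads(line))
    return records
```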
## Use Cases
This dataset is particularly suited for:
- Training and fine-tuning speech recognition models on the Hokchia language.
- Linguistic studies focusing on the Hokchia dialect.
- Developing voice-activated applications that require understanding of Hokchia.
## How to Use
You can load this dataset using the Hugging Face `datasets` library:
```python
from datasets import load_dataset
dataset = load_dataset("AnnoFichel/hokchia_audio_dataset")
```
## Acknowledgements
This dataset was collected and prepared by Jack Huang. We acknowledge the contributions of the speakers who participated in the recording sessions and the individuals who provided transcriptions. |
BangumiBase/chainsawman | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Chainsaw Man
This is the image base of the bangumi Chainsaw Man. We detected 45 characters and 4,553 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 1097 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 17 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 11 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 14 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 25 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 10 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 95 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 23 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 28 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 303 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 601 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 415 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 42 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 48 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 41 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 43 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 136 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 37 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 16 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 20 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 51 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 13 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 57 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 12 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 11 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 15 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 13 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 90 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 26 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 72 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 305 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 426 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 22 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 47 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 12 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 23 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 20 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 29 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 15 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 85 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 12 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 12 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 25 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 15 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 123 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
CyberHarem/oikawa_shizuku_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of oikawa_shizuku/及川雫/오이카와시즈쿠 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of oikawa_shizuku/及川雫/오이카와시즈쿠 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are `short_hair, brown_hair, breasts, brown_eyes, large_breasts, huge_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 494.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oikawa_shizuku_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 334.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oikawa_shizuku_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1161 | 670.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oikawa_shizuku_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 460.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oikawa_shizuku_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1161 | 859.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oikawa_shizuku_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/oikawa_shizuku_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, collar, cow_ears, cow_girl, cow_horns, cow_print, cowbell, elbow_gloves, headset, neck_bell, open_mouth, pink_thighhighs, solo, blush, cleavage, cow_tail, pink_gloves, skirt, smile, navel |
| 1 | 5 |  |  |  |  |  | 1girl, blush, collar, cow_ears, cow_girl, cow_horns, cow_print, cowbell, elbow_gloves, headset, neck_bell, nipples, open_mouth, pink_thighhighs, solo, skirt, cow_tail, lactation, smile |
| 2 | 15 |  |  |  |  |  | 1girl, collar, cow_ears, cow_girl, cow_horns, cow_print, cowbell, elbow_gloves, neck_bell, solo, headset, cow_tail, blush, open_mouth, smile, cleavage, navel, looking_at_viewer, simple_background, skirt, white_background |
| 3 | 7 |  |  |  |  |  | 1girl, blush, collar, cow_ears, cow_girl, cow_horns, cow_print, cowbell, elbow_gloves, neck_bell, headset, open_mouth, solo, lactation, smile, nipples |
| 4 | 5 |  |  |  |  |  | 1boy, 1girl, blush, collar, cow_ears, cow_girl, cow_horns, cow_print, cow_tail, cowbell, elbow_gloves, hetero, neck_bell, paizuri, penis, solo_focus, breasts_squeezed_together, headset, mosaic_censoring, nipples, open_mouth, ejaculation, :d, cum_on_breasts, pink_gloves, tongue |
| 5 | 5 |  |  |  |  |  | 1girl, blush, collar, cow_ears, cow_girl, cow_horns, cow_print, elbow_gloves, hetero, multiple_penises, neck_bell, nipples, solo_focus, cowbell, double_handjob, gloved_handjob, lactation, open_mouth, bar_censor, cum_on_breasts, facial, gangbang, headset, paizuri, pink_gloves, 2boys, 3boys, 4boys, 5boys, bukkake, cum_on_hair, cum_on_tongue, ejaculation, fellatio, gigantic_breasts, mmf_threesome, pointless_censoring, veins |
| 6 | 11 |  |  |  |  |  | looking_at_viewer, 1girl, blush, cleavage, collarbone, navel, open_mouth, solo, bangs, simple_background, white_background, bare_shoulders, :d, side-tie_bikini_bottom, striped_bikini, white_bikini |
| 7 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, simple_background, solo, white_background, blush, open_mouth, :d |
| 8 | 13 |  |  |  |  |  | 1boy, 1girl, hetero, solo_focus, blush, nipples, paizuri, cum_on_breasts, penis, smile, open_mouth, breasts_squeezed_together, censored, ejaculation, gigantic_breasts |
| 9 | 9 |  |  |  |  |  | 1girl, bare_shoulders, elbow_gloves, open_mouth, solo, black_gloves, blush, looking_at_viewer, bangs, hairband, smile, belt, bodysuit, simple_background, zipper |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | collar | cow_ears | cow_girl | cow_horns | cow_print | cowbell | elbow_gloves | headset | neck_bell | open_mouth | pink_thighhighs | solo | blush | cleavage | cow_tail | pink_gloves | skirt | smile | navel | nipples | lactation | looking_at_viewer | simple_background | white_background | 1boy | hetero | paizuri | penis | solo_focus | breasts_squeezed_together | mosaic_censoring | ejaculation | :d | cum_on_breasts | tongue | multiple_penises | double_handjob | gloved_handjob | bar_censor | facial | gangbang | 2boys | 3boys | 4boys | 5boys | bukkake | cum_on_hair | cum_on_tongue | fellatio | gigantic_breasts | mmf_threesome | pointless_censoring | veins | collarbone | bangs | bare_shoulders | side-tie_bikini_bottom | striped_bikini | white_bikini | censored | black_gloves | hairband | belt | bodysuit | zipper |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:-----------|:-----------|:------------|:------------|:----------|:---------------|:----------|:------------|:-------------|:------------------|:-------|:--------|:-----------|:-----------|:--------------|:--------|:--------|:--------|:----------|:------------|:--------------------|:--------------------|:-------------------|:-------|:---------|:----------|:--------|:-------------|:----------------------------|:-------------------|:--------------|:-----|:-----------------|:---------|:-------------------|:-----------------|:-----------------|:-------------|:---------|:-----------|:--------|:--------|:--------|:--------|:----------|:--------------|:----------------|:-----------|:-------------------|:----------------|:----------------------|:--------|:-------------|:--------|:-----------------|:-------------------------|:-----------------|:---------------|:-----------|:---------------|:-----------|:-------|:-----------|:---------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | X | X | X | X | | X | X | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | X | X | | | | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | X | | X | X | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | X | | | X | | | | X | X | | | | | X | X | | X | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 6 | 11 |  |  |  |  |  | X | | | | | | | | | | X | | X | X | X | | | | | X | | | X | X | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | |
| 7 | 7 |  |  |  |  |  | X | | | | | | | | | | X | | X | X | | | | | | | | | X | X | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 13 |  |  |  |  |  | X | | | | | | | | | | X | | | X | | | | | X | | X | | | | | X | X | X | X | X | X | | X | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | X | | | | | |
| 9 | 9 |  |  |  |  |  | X | | | | | | | X | | | X | | X | X | | | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | X | X | X | X | X |
|
liuyanchen1015/VALUE_mrpc_null_genetive | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 120404
num_examples: 436
- name: train
num_bytes: 262071
num_examples: 933
- name: validation
num_bytes: 34673
num_examples: 122
download_size: 277759
dataset_size: 417148
---
# Dataset Card for "VALUE_mrpc_null_genetive"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lansinuote/ocr_id_card_small | ---
dataset_info:
features:
- name: image
dtype: image
- name: ocr
list:
- name: box
sequence: float64
- name: cls
dtype: int64
- name: word
dtype: string
splits:
- name: train
num_bytes: 539900555.5
num_examples: 8500
download_size: 539277160
dataset_size: 539900555.5
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
easytpp/amazon | ---
license: apache-2.0
---
|
AnnasBlackHat/alpaca-indonesia-llama | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 27355550
num_examples: 39440
- name: test
num_bytes: 6995124
num_examples: 9860
download_size: 17306368
dataset_size: 34350674
---
# Dataset Card for "alpaca-indonesia-llama"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kinit-tomassako/ver_claimdetection_demo2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
heliosprime/twitter_dataset_1713018257 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 9496
num_examples: 22
download_size: 9140
dataset_size: 9496
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713018257"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
knat/TDTURegulation | ---
language:
- vi
pretty_name: TDTU Regulation
--- |
julius-br/GARFAB | ---
license: mit
---
**Published**: September 21st, 2022 <br>
**Author**: Julius Breiholz
# GARFAB-Dataset
The (G)erman corpus of annotated (A)pp (R)eviews to detect (F)eature requests (A)nd (B)ug reports (GARFAB) is a dataset for fine-tuning models that classify German-language app store reviews (ASRs) into "Feature Requests", "Bug Reports", and "Irrelevant". All ASRs were collected from the Google Play Store and classified manually by two independent annotators. A weighted and a full version are published with the following distributions of ASRs:
| | Feature Request | Bug Reports | Irrelevant | Total |
| --- | --- | --- | --- | --- |
| full | 345 | 387 | 2212 | 2944 |
| weighted | 345 | 345 | 345 | 1035 |
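The full version is heavily skewed toward the Irrelevant class. When fine-tuning on it instead of the weighted version, inverse-frequency class weights are one common mitigation; the sketch below (illustrative only, not shipped with the dataset) derives them from the table above:

```python
# Class counts from the "full" version in the table above.
counts = {"feature_request": 345, "bug_report": 387, "irrelevant": 2212}
total = sum(counts.values())  # 2944

# Inverse-frequency weighting: rarer classes receive proportionally larger weights.
weights = {label: total / (len(counts) * n) for label, n in counts.items()}
```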
|
jganzabalseenka/mlsum-spanish-truncated-512 | ---
dataset_info:
features:
- name: text
dtype: string
- name: summary
dtype: string
- name: topic
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: date
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: decoder_input_ids
sequence: int64
- name: decoder_attention_mask
sequence: int64
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 2720064586
num_examples: 266367
- name: validation
num_bytes: 109186816
num_examples: 10358
- name: test
num_bytes: 149939505
num_examples: 13920
download_size: 1157591855
dataset_size: 2979190907
---
# Dataset Card for "mlsum-spanish-truncated-512"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mteb-pt/emotion | ---
configs:
- config_name: pt-br
data_files:
- split: train
path: train_df*
- split: validation
path: validation_df*
- split: test
path: test_df*
--- |
fun1021183/cvt1_GS3_test_f | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 2139623378.875
num_examples: 13257
- name: test
num_bytes: 745774671.875
num_examples: 4721
download_size: 2721265703
dataset_size: 2885398050.75
---
# Dataset Card for "cvt1_GS3_test_f"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
COLD-team/COLD | ---
license: cc-by-sa-4.0
---
## COLD: Complex Offensive Language Dataset
If you use this dataset, please cite the following paper (BibTex below):
Alexis Palmer, Christine Carr, Melissa Robinson, and Jordan Sanders. 2020 (to appear). COLD: Annotation scheme and evaluation data set for complex offensive language in English. *Journal of Linguistics and Computational Linguistics*.
## Overview of data
The COLD data set is intended for researchers to diagnose and assess their automatic hate speech detection systems. The corpus highlights 4 different types of complex offensive language (slurs, reclaimed slurs, adjective nominalization, and distancing), as well as non-offensive texts. The corpus contains a set of tweets collected from 3 different data sets: Davidson et al. (2017), Waseem and Hovy (2016), and Robinson (2018). The data were annotated by 6 annotators, with each instance labeled by at least 3 different annotators.
**COLD-2016** is the data set used for the analyses and experimental results described in the JLCL paper. This version of the data set contains 2016 instances, selected using filters aiming to capture the complex offensive language types listed above.
## Format and annotations
The data are made available here as .tsv files. The columns fall into four groups, described below: informational columns, majority-vote columns, individual annotator columns, and a category column.
### Informational columns:
1. **ID** - information about the original data set and the textual instance's ID from the data set it was extracted from. The ID includes a letter indicating which data set it originates from, followed by a hyphen and the corresponding ID of the instance in the original data set. For example: D-63 means that the instance is from the Davidson et al. (2017) data set, originally with the ID number 63.
2. **Dataset** - a letter indicating from which dataset this instance originates.
3. **Text** - the text of the instance.
### Majority Vote Columns:
For each instance, annotators were asked to answer Yes or No to each of four questions. These columns hold the majority vote from the three annotators (see the paper for a much more detailed discussion, as well as distributions, etc.).
1. **Off** Is this text offensive?
2. **Slur** Is there a slur in the text?
3. **Nom** Is there an adjectival nominalization in the text?
4. **Dist** Is there (linguistic) distancing in the text?
### Individual Annotator Columns:
For each instance, annotators were asked to answer Yes or No to each of four questions. These columns hold the individual responses from each annotator (see the paper for a much more detailed discussion, as well as distributions, etc.).
1. **Off1/2/3** Is this text offensive?
2. **Slur1/2/3** Is there a slur in the text?
3. **Nom1/2/3** Is there an adjectival nominalization in the text?
4. **Dist1/2/3** Is there (linguistic) distancing in the text?
### Category
1. **Cat** This column is deduced from the majority votes for OFF/SLUR/NOM/DIST. (See the paper for a detailed explanation of the categories, as well as distributions, etc.)
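Because the files are plain .tsv, they can be read with the standard library alone. A minimal sketch (the exact header spellings in the released files are assumptions based on the column descriptions above):

```python
import csv

def load_cold_tsv(path):
    """Read a COLD-style .tsv file into a list of dicts keyed by column name."""
    with open(path, encoding="utf-8", newline="") as f:
        return list(csv.DictReader(f, delimiter="\t"))
```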
## Contact
If you have any questions, please contact carrc9953@gmail.com, alexis.palmer@unt.edu, or melissa.robinson@my.unt.edu.
## BibTex
```
@article{cold:2020,
title = {COLD: Annotation scheme and evaluation data set for complex offensive language in English},
author = {Palmer, Alexis and Carr, Christine and Robinson, Melissa and Sanders, Jordan},
journal = {Journal of Linguistics and Computational Linguistics, Special Issue},
year = {2020},
volume={to appear},
number={to appear},
pages = {tbd}
}
```
## References
Davidson, T., Warmsley, D., Macy, M., & Weber, I. (2017). Automated hate speech detection and
the problem of offensive language. In Eleventh international conference on web and
social media. <a href="https://aaai.org/ocs/index.php/ICWSM/ICWSM17/paper/view/15665">[the paper]</a>, <a href="https://github.com/t-davidson/hate-speech-and-offensive-language">[the repository]</a>
Robinson, M. (2018). A man needs a female like a fish needs a lobotomy: The role of adjectival
nominalization in pejorative meaning. Master's thesis, Department of Linguistics, University of North Texas.
<a href="https://digital.library.unt.edu/ark:/67531/metadc1157617/m2/1/high_res_d/ROBINSON-THESIS-2018.pdf">[the thesis]</a>
Waseem, Z., & Hovy, D. (2016). Hateful Symbols or Hateful People? Predictive Features for
Hate Speech Detection on Twitter. In Proceedings of the NAACL Student Research Workshop. San Diego, California.
<a href="https://www.aclweb.org/anthology/N16-2013/">[the paper]</a> |
Felladrin/pretrain-reddit-instruct | ---
license: mit
source_datasets:
- euclaise/reddit-instruct
---
Conversion of [euclaise/reddit-instruct](https://huggingface.co/datasets/euclaise/reddit-instruct) dataset to be used in pretraining.
Python code used for conversion:
```python
from datasets import load_dataset
import pandas
import html
dataset = load_dataset("euclaise/reddit-instruct", split="train")
def format_row(row):
    # Unescape HTML entities and strip surrounding whitespace from the comment text.
    return html.unescape(row["comment_text"].strip())

pandas.DataFrame({"text": [format_row(row) for row in dataset]}).to_csv("train.csv", index=False)
```
|
Santhosh-V/guanaco-llama2-200 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 338808
num_examples: 200
download_size: 201258
dataset_size: 338808
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |