---
pretty_name: Evaluation run of klosax/open_llama_13b_600bt_preview
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [klosax/open_llama_13b_600bt_preview](https://huggingface.co/klosax/open_llama_13b_600bt_preview)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_klosax__open_llama_13b_600bt_preview\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-13T03:30:54.296590](https://huggingface.co/datasets/open-llm-leaderboard/details_klosax__open_llama_13b_600bt_preview/blob/main/results_2023-10-13T03-30-54.296590.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each one in the results and the \"latest\"\
\ split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0007340604026845638,\n\
\ \"em_stderr\": 0.0002773614457335577,\n \"f1\": 0.05345952181208074,\n\
\ \"f1_stderr\": 0.0013179175979171436,\n \"acc\": 0.35200275495116307,\n\
\ \"acc_stderr\": 0.00844603886086826\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.0002773614457335577,\n\
\ \"f1\": 0.05345952181208074,\n \"f1_stderr\": 0.0013179175979171436\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.019711902956785442,\n \
\ \"acc_stderr\": 0.0038289829787357095\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6842936069455406,\n \"acc_stderr\": 0.01306309474300081\n\
\ }\n}\n```"
repo_url: https://huggingface.co/klosax/open_llama_13b_600bt_preview
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|arc:challenge|25_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_13T03_30_54.296590
path:
- '**/details_harness|drop|3_2023-10-13T03-30-54.296590.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-13T03-30-54.296590.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_13T03_30_54.296590
path:
- '**/details_harness|gsm8k|5_2023-10-13T03-30-54.296590.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-13T03-30-54.296590.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hellaswag|10_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T13:16:53.504073.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T13:16:53.504073.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T13:16:53.504073.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_13T03_30_54.296590
path:
- '**/details_harness|winogrande|5_2023-10-13T03-30-54.296590.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-13T03-30-54.296590.parquet'
- config_name: results
data_files:
- split: 2023_07_24T13_16_53.504073
path:
- results_2023-07-24T13:16:53.504073.parquet
- split: 2023_10_13T03_30_54.296590
path:
- results_2023-10-13T03-30-54.296590.parquet
- split: latest
path:
- results_2023-10-13T03-30-54.296590.parquet
---
# Dataset Card for Evaluation run of klosax/open_llama_13b_600bt_preview
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/klosax/open_llama_13b_600bt_preview
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [klosax/open_llama_13b_600bt_preview](https://huggingface.co/klosax/open_llama_13b_600bt_preview) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_klosax__open_llama_13b_600bt_preview",
"harness_winogrande_5",
split="train")
```
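Each timestamped split name in the configs above appears to be derived mechanically from the run timestamp. A small helper illustrating that mapping (inferred from the names in this card, not an official API):

```python
# Illustrative only: split names in this card look like the run timestamp with
# '-' and ':' replaced by '_' (compare split 2023_10_13T03_30_54.296590 with
# run 2023-10-13T03:30:54.296590).
def split_name(run_timestamp: str) -> str:
    """Map a run timestamp to the split name used in the configs."""
    return run_timestamp.replace("-", "_").replace(":", "_")

print(split_name("2023-10-13T03:30:54.296590"))  # → 2023_10_13T03_30_54.296590
```

Passing such a name as `split=` pins the load to one specific run instead of the rolling "latest"/"train" split.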
## Latest results
These are the [latest results from run 2023-10-13T03:30:54.296590](https://huggingface.co/datasets/open-llm-leaderboard/details_klosax__open_llama_13b_600bt_preview/blob/main/results_2023-10-13T03-30-54.296590.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its own configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0007340604026845638,
"em_stderr": 0.0002773614457335577,
"f1": 0.05345952181208074,
"f1_stderr": 0.0013179175979171436,
"acc": 0.35200275495116307,
"acc_stderr": 0.00844603886086826
},
"harness|drop|3": {
"em": 0.0007340604026845638,
"em_stderr": 0.0002773614457335577,
"f1": 0.05345952181208074,
"f1_stderr": 0.0013179175979171436
},
"harness|gsm8k|5": {
"acc": 0.019711902956785442,
"acc_stderr": 0.0038289829787357095
},
"harness|winogrande|5": {
"acc": 0.6842936069455406,
"acc_stderr": 0.01306309474300081
}
}
```
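As a quick sanity check (not stated in the card itself), the reported `acc_stderr` values are consistent with the sample standard error of a 0/1 per-example score. A minimal sketch, where the 1,267-example WinoGrande eval set size is an assumption:

```python
import math

def binary_acc_stderr(p: float, n: int) -> float:
    """Sample standard error of the mean for a 0/1 score with mean p over n items."""
    return math.sqrt(p * (1.0 - p) / (n - 1))

p = 0.6842936069455406                  # harness|winogrande|5 "acc" above
estimate = binary_acc_stderr(p, 1267)   # n = 1267 is an assumption, not from the card
print(round(estimate, 6))               # ≈ 0.013063, matching the reported acc_stderr
```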
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
hmzkhnswt/guanaco-llama2-200 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 338808
num_examples: 200
download_size: 201258
dataset_size: 338808
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlekseyKorshuk/reward-model-no-topic-predictions | ---
dataset_info:
features:
- name: text
dtype: string
- name: lang
dtype: string
- name: lang_score
dtype: float64
- name: topic
dtype: float64
- name: topic_prob
dtype: float64
- name: was_outlier
dtype: float64
- name: comments
list:
- name: prediction
dtype: float64
- name: score
dtype: int64
- name: text
dtype: string
splits:
- name: validation
num_bytes: 24952821
num_examples: 8811
download_size: 15720103
dataset_size: 24952821
---
# Dataset Card for "reward-model-no-topic-predictions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
egecandrsn/weatherdata | ---
license: unknown
language:
- en
size_categories:
- 1K<n<10K
---
# Weather Dataset README
## Overview
This dataset contains daily weather data for Ankara, Turkey, from 2016-04-01 to 2022-04-01, including temperature, precipitation, wind speed, and other related measurements.
## Dataset Description
Each row in the dataset represents a single day's weather data. The columns in the dataset are as follows:
- **name** (string): Name of the location (Ankara)
- **datetime** (string): Date in the format "YYYY-MM-DD"
- **tempmax** (float64): Maximum temperature in Celsius
- **tempmin** (float64): Minimum temperature in Celsius
- **temp** (float64): Average temperature in Celsius
- **feelslikemax** (float64): Maximum "feels like" temperature in Celsius
- **feelslikemin** (float64): Minimum "feels like" temperature in Celsius
- **feelslike** (float64): Average "feels like" temperature in Celsius
- **dew** (float64): Dew point temperature in Celsius
- **humidity** (float64): Humidity percentage
- **precip** (float64): Precipitation amount in millimeters
- **precipprob** (int64): Precipitation probability percentage
- **precipcover** (float64): Precipitation coverage percentage
- **preciptype** (null): Precipitation type (expected to be null throughout the dataset; non-null values are known errors)
- **snow** (float64): Snowfall amount in centimeters
- **snowdepth** (float64): Snow depth in centimeters
- **windgust** (float64): Maximum wind gust speed in kilometers per hour
- **windspeed** (float64): Average wind speed in kilometers per hour
- **winddir** (float64): Wind direction in degrees (0-360)
- **sealevelpressure** (float64): Sea-level pressure in millibars
- **cloudcover** (float64): Cloud coverage percentage
- **visibility** (float64): Visibility distance in kilometers
- **solarradiation** (float64): Solar radiation in Watts per square meter
- **solarenergy** (float64): Solar energy in kilojoules per square meter
- **uvindex** (int64): UV index value
- **severerisk** (float64): Severe weather risk percentage
- **sunrise** (string): Sunrise time in the format "YYYY-MM-DDTHH:mm:ss"
- **sunset** (string): Sunset time in the format "YYYY-MM-DDTHH:mm:ss"
- **moonphase** (float64): Moon phase value (0 to 1)
- **conditions** (string): General weather conditions
- **description** (string): Detailed weather description
- **icon** (string): Weather icon identifier
- **stations** (string): Comma-separated list of weather station IDs
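A minimal validation sketch for one daily record, based on the schema above; the sample row and the specific checks are illustrative, not part of the dataset:

```python
from datetime import datetime

def validate_row(row: dict) -> list[str]:
    """Return a list of schema violations for a single day's record."""
    problems = []
    # 'preciptype' should be null per the card; non-null values are known errors.
    if row.get("preciptype") is not None:
        problems.append(f"preciptype should be null, got {row['preciptype']!r}")
    # 'datetime' must parse as YYYY-MM-DD.
    try:
        datetime.strptime(row["datetime"], "%Y-%m-%d")
    except (KeyError, ValueError):
        problems.append("datetime is missing or not in YYYY-MM-DD format")
    # 'winddir' is in degrees, 0-360.
    if not 0.0 <= row.get("winddir", 0.0) <= 360.0:
        problems.append("winddir out of [0, 360]")
    # 'moonphase' ranges from 0 to 1.
    if not 0.0 <= row.get("moonphase", 0.0) <= 1.0:
        problems.append("moonphase out of [0, 1]")
    return problems

row = {"name": "Ankara", "datetime": "2016-04-01", "preciptype": None,
       "winddir": 210.0, "moonphase": 0.42}
print(validate_row(row))  # → []
```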
## Notes
Please note that there are some errors in the dataset, such as non-null values in the "preciptype" column. Be sure to handle these cases appropriately when processing the data. |
autoevaluate/autoeval-staging-eval-launch__gov_report-plain_text-2fa37c-16136229 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- launch/gov_report
eval_info:
task: summarization
model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP11
metrics: ['bertscore']
dataset_name: launch/gov_report
dataset_config: plain_text
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP11
* Dataset: launch/gov_report
* Config: plain_text
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@nonchalant-nagavalli](https://huggingface.co/nonchalant-nagavalli) for evaluating this model. |
open-llm-leaderboard/details_KnutJaegersberg__deacon-3b | ---
pretty_name: Evaluation run of KnutJaegersberg/deacon-3b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KnutJaegersberg/deacon-3b](https://huggingface.co/KnutJaegersberg/deacon-3b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__deacon-3b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T13:23:04.115502](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__deacon-3b/blob/main/results_2023-10-28T13-23-04.115502.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0009437919463087249,\n\
\ \"em_stderr\": 0.0003144653119413187,\n \"f1\": 0.05062919463087265,\n\
\ \"f1_stderr\": 0.0012970020903289405,\n \"acc\": 0.32509979517380905,\n\
\ \"acc_stderr\": 0.007564621001375068\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0009437919463087249,\n \"em_stderr\": 0.0003144653119413187,\n\
\ \"f1\": 0.05062919463087265,\n \"f1_stderr\": 0.0012970020903289405\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0037907505686125853,\n \
\ \"acc_stderr\": 0.0016927007401501847\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6464088397790055,\n \"acc_stderr\": 0.013436541262599952\n\
\ }\n}\n```"
repo_url: https://huggingface.co/KnutJaegersberg/deacon-3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T13_23_04.115502
path:
- '**/details_harness|drop|3_2023-10-28T13-23-04.115502.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T13-23-04.115502.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T13_23_04.115502
path:
- '**/details_harness|gsm8k|5_2023-10-28T13-23-04.115502.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T13-23-04.115502.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-47-42.541004.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-47-42.541004.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-47-42.541004.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T13_23_04.115502
path:
- '**/details_harness|winogrande|5_2023-10-28T13-23-04.115502.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T13-23-04.115502.parquet'
- config_name: results
data_files:
- split: 2023_09_18T14_47_42.541004
path:
- results_2023-09-18T14-47-42.541004.parquet
- split: 2023_10_28T13_23_04.115502
path:
- results_2023-10-28T13-23-04.115502.parquet
- split: latest
path:
- results_2023-10-28T13-23-04.115502.parquet
---
# Dataset Card for Evaluation run of KnutJaegersberg/deacon-3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/deacon-3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KnutJaegersberg/deacon-3b](https://huggingface.co/KnutJaegersberg/deacon-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__deacon-3b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-28T13:23:04.115502](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__deacon-3b/blob/main/results_2023-10-28T13-23-04.115502.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.0009437919463087249,
"em_stderr": 0.0003144653119413187,
"f1": 0.05062919463087265,
"f1_stderr": 0.0012970020903289405,
"acc": 0.32509979517380905,
"acc_stderr": 0.007564621001375068
},
"harness|drop|3": {
"em": 0.0009437919463087249,
"em_stderr": 0.0003144653119413187,
"f1": 0.05062919463087265,
"f1_stderr": 0.0012970020903289405
},
"harness|gsm8k|5": {
"acc": 0.0037907505686125853,
"acc_stderr": 0.0016927007401501847
},
"harness|winogrande|5": {
"acc": 0.6464088397790055,
"acc_stderr": 0.013436541262599952
}
}
```
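As a quick sanity check (an illustration added here, not part of the generated card), the aggregate `acc` in the `"all"` block above is simply the unweighted mean of the two per-task accuracies:

```python
# Per-task accuracies copied from the "Latest results" JSON above
gsm8k_acc = 0.0037907505686125853
winogrande_acc = 0.6464088397790055

# The "all" block reports their unweighted mean
all_acc = (gsm8k_acc + winogrande_acc) / 2
print(all_acc)  # ≈ 0.32509979517380905, matching the reported aggregate
```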
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
El-chapoo/Summary-wiki-bio | ---
dataset_info:
features:
- name: target_text
dtype: string
splits:
- name: train
num_bytes: 309758384
num_examples: 582659
- name: test
num_bytes: 38619814
num_examples: 72831
- name: val
num_bytes: 38609985
num_examples: 72831
download_size: 237817185
dataset_size: 386988183
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: val
path: data/val-*
---
|
AlekseyKorshuk/pickapic_v2-prompts-dedup | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: __cluster__
dtype: int64
splits:
- name: train
num_bytes: 461516230.6301592
num_examples: 2802392
download_size: 212898248
dataset_size: 461516230.6301592
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
thesid01/dummy-thesid01 | ---
license: apache-2.0
---
|
ucla-contextual/contextual_val | ---
configs:
- config_name: default
data_files:
- split: val
path: "contextual_val.csv"
license: mit
---
Check out the [paper](https://arxiv.org/abs/2401.13311). |
yzhuang/autotree_automl_default-of-credit-card-clients_gosdt_l512_d3_sd3 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 10863200000
num_examples: 100000
- name: validation
num_bytes: 1086320000
num_examples: 10000
download_size: 2051534214
dataset_size: 11949520000
---
# Dataset Card for "autotree_automl_default-of-credit-card-clients_gosdt_l512_d3_sd3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Azure99__blossom-v2-llama2-7b | ---
pretty_name: Evaluation run of Azure99/blossom-v2-llama2-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Azure99/blossom-v2-llama2-7b](https://huggingface.co/Azure99/blossom-v2-llama2-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Azure99__blossom-v2-llama2-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T10:27:00.906454](https://huggingface.co/datasets/open-llm-leaderboard/details_Azure99__blossom-v2-llama2-7b/blob/main/results_2023-10-25T10-27-00.906454.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.35192953020134227,\n\
\ \"em_stderr\": 0.004890785574074548,\n \"f1\": 0.4060790687919473,\n\
\ \"f1_stderr\": 0.004773085782687634,\n \"acc\": 0.3956260064038852,\n\
\ \"acc_stderr\": 0.009074017772494654\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.35192953020134227,\n \"em_stderr\": 0.004890785574074548,\n\
\ \"f1\": 0.4060790687919473,\n \"f1_stderr\": 0.004773085782687634\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.047763457164518575,\n \
\ \"acc_stderr\": 0.005874387536229319\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7434885556432518,\n \"acc_stderr\": 0.012273648008759989\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Azure99/blossom-v2-llama2-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T10_27_00.906454
path:
- '**/details_harness|drop|3_2023-10-25T10-27-00.906454.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T10-27-00.906454.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T10_27_00.906454
path:
- '**/details_harness|gsm8k|5_2023-10-25T10-27-00.906454.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T10-27-00.906454.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-39-22.579303.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-39-22.579303.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-39-22.579303.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T10_27_00.906454
path:
- '**/details_harness|winogrande|5_2023-10-25T10-27-00.906454.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T10-27-00.906454.parquet'
- config_name: results
data_files:
- split: 2023_09_11T17_39_22.579303
path:
- results_2023-09-11T17-39-22.579303.parquet
- split: 2023_10_25T10_27_00.906454
path:
- results_2023-10-25T10-27-00.906454.parquet
- split: latest
path:
- results_2023-10-25T10-27-00.906454.parquet
---
# Dataset Card for Evaluation run of Azure99/blossom-v2-llama2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Azure99/blossom-v2-llama2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Azure99/blossom-v2-llama2-7b](https://huggingface.co/Azure99/blossom-v2-llama2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Azure99__blossom-v2-llama2-7b",
"harness_winogrande_5",
split="train")
```
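The timestamped split names (e.g. `2023_10_25T10_27_00.906454`) are zero-padded, so they sort lexicographically in chronological order. A small helper — a sketch, not part of the leaderboard tooling — can pick the newest run without hardcoding its name:

```python
def latest_run_split(split_names):
    """Return the most recent timestamped split name, ignoring the 'latest' alias."""
    stamped = [name for name in split_names if name != "latest"]
    # Zero-padded timestamp strings sort chronologically under plain string order.
    return max(stamped)

# latest_run_split(["2023_09_11T17_39_22.579303", "2023_10_25T10_27_00.906454", "latest"])
# -> "2023_10_25T10_27_00.906454"
```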
## Latest results
These are the [latest results from run 2023-10-25T10:27:00.906454](https://huggingface.co/datasets/open-llm-leaderboard/details_Azure99__blossom-v2-llama2-7b/blob/main/results_2023-10-25T10-27-00.906454.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.35192953020134227,
"em_stderr": 0.004890785574074548,
"f1": 0.4060790687919473,
"f1_stderr": 0.004773085782687634,
"acc": 0.3956260064038852,
"acc_stderr": 0.009074017772494654
},
"harness|drop|3": {
"em": 0.35192953020134227,
"em_stderr": 0.004890785574074548,
"f1": 0.4060790687919473,
"f1_stderr": 0.004773085782687634
},
"harness|gsm8k|5": {
"acc": 0.047763457164518575,
"acc_stderr": 0.005874387536229319
},
"harness|winogrande|5": {
"acc": 0.7434885556432518,
"acc_stderr": 0.012273648008759989
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Nicky0007/titulos_noticias_rcn_clasificadas | ---
task_categories:
- token-classification
language:
- es
size_categories:
- 1K<n<10K
---
# Dataset Card for Dataset Name
titulos_noticias_rcn_clasificadas
## Dataset Description
News items were taken from the RCN website and the titles were classified into ['salud' 'tecnologia' 'colombia' 'economia' 'deportes']:
salud = 1805 rows,
tecnologia = 1805 rows,
colombia = 1805 rows,
economia = 1805 rows,
deportes = 1805 rows,
for a total of 9030 rows.
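A minimal sketch for mapping integer labels back to the five category names. The order follows the list above; note this ordering is an assumption for illustration, not something documented by the dataset:

```python
# Assumed label order, taken from the category list in this card.
CATEGORIES = ["salud", "tecnologia", "colombia", "economia", "deportes"]

def label_name(label: int) -> str:
    """Map an integer class label (0-4) to its assumed category name."""
    return CATEGORIES[label]
```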
Website: https://www.noticiasrcn.com/
- **Homepage:**
- **Repository:**
- **Point of Contact:**
### Languages
Spanish
## Dataset Structure
text, label, url |
klima7/minecraft-segmentation | ---
license: mit
---
|
SEACrowd/bible_jv_id | ---
tags:
- machine-translation
language:
- ind
- jav
---
# bible_jv_id
Analogous to the En ↔ Id and Su ↔ Id datasets, we create a new dataset for Javanese and Indonesian translation generated from the verse-aligned Bible parallel corpus with the same split setting. In terms of size, both the Su ↔ Id and Jv ↔ Id datasets are much smaller compared to the En ↔ Id dataset, because there are Bible chapters for which translations are available for Indonesian, albeit not for the local languages.
## Dataset Usage
Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`.
## Citation
```
@inproceedings{cahyawijaya-etal-2021-indonlg,
title = "{I}ndo{NLG}: Benchmark and Resources for Evaluating {I}ndonesian Natural Language Generation",
author = "Cahyawijaya, Samuel and
Winata, Genta Indra and
Wilie, Bryan and
Vincentio, Karissa and
Li, Xiaohong and
Kuncoro, Adhiguna and
Ruder, Sebastian and
Lim, Zhi Yuan and
Bahar, Syafri and
Khodra, Masayu and
Purwarianti, Ayu and
Fung, Pascale",
booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing",
month = nov,
year = "2021",
address = "Online and Punta Cana, Dominican Republic",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.emnlp-main.699",
doi = "10.18653/v1/2021.emnlp-main.699",
pages = "8875--8898",
abstract = "Natural language generation (NLG) benchmarks provide an important avenue to measure progress and develop better NLG systems. Unfortunately, the lack of publicly available NLG benchmarks for low-resource languages poses a challenging barrier for building NLG systems that work well for languages with limited amounts of data. Here we introduce IndoNLG, the first benchmark to measure natural language generation (NLG) progress in three low-resource{---}yet widely spoken{---}languages of Indonesia: Indonesian, Javanese, and Sundanese. Altogether, these languages are spoken by more than 100 million native speakers, and hence constitute an important use case of NLG systems today. Concretely, IndoNLG covers six tasks: summarization, question answering, chit-chat, and three different pairs of machine translation (MT) tasks. We collate a clean pretraining corpus of Indonesian, Sundanese, and Javanese datasets, Indo4B-Plus, which is used to pretrain our models: IndoBART and IndoGPT. We show that IndoBART and IndoGPT achieve competitive performance on all tasks{---}despite using only one-fifth the parameters of a larger multilingual model, mBART-large (Liu et al., 2020). This finding emphasizes the importance of pretraining on closely related, localized languages to achieve more efficient learning and faster inference at very low-resource languages like Javanese and Sundanese.",
}
```
## License
Creative Commons Attribution Share-Alike 4.0 International
## Homepage
[https://github.com/IndoNLP/indonlg](https://github.com/IndoNLP/indonlg)
### NusaCatalogue
For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue) |
mHossain/final_train_v4_test_1100000 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: input_text
dtype: string
- name: target_text
dtype: string
- name: prefix
dtype: string
splits:
- name: train
num_bytes: 7290159.3
num_examples: 18000
- name: test
num_bytes: 810017.7
num_examples: 2000
download_size: 3489433
dataset_size: 8100177.0
---
# Dataset Card for "final_train_v4_test_1100000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SamuelGalaxys/iJupyter | ---
license: apache-2.0
---
# China Administrative Division Data
Data demo: [https://passer-by.com/data_location/](https://passer-by.com/data_location/)
Three-level cascading selector plugin: [https://jquerywidget.com/jquery-citys/](https://jquerywidget.com/jquery-citys/)
### Copyright
The database is compiled by [passer-by.com](https://passer-by.com/); follow this project to get the latest and most complete data.
### Data Notes
- Province, city, and district data come from the Ministry of Civil Affairs, State Council announcements, and the National Bureau of Statistics, ensuring they are up to date and authoritative;
- Street (town/township) data is very large and involves many official documents between the various levels, so timeliness cannot be guaranteed (latest data: October 31, 2022);
- Street (town/township) data spans many files; to stay compatible with older division codes, updates overwrite the existing files;
- The data covers administrative divisions proper. Administrative management zones that overlap with administrative divisions are not included;
(administrative management zones typically include special economic zones / economic development zones / high-tech zones / new districts / industrial zones; some have since been upgraded to administrative divisions and must be distinguished)
### About the division codes
The codes follow the national standard "Codes for the administrative divisions of the People's Republic of China" (GB/T 2260).
A code is read in three levels, from left to right:
- Digits 1–2: province (autonomous region, municipality, special administrative region)
- Digits 3–4: city (prefecture, autonomous prefecture, league, and the aggregate code for districts and counties directly under a municipality)
- Digits 5–6: county (urban district, county-level city, banner)
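The three-level layout described above can be sketched as a tiny parser (a hypothetical helper for illustration, not part of this dataset's tooling):

```python
def split_division_code(code: str) -> dict:
    """Split a 6-digit GB/T 2260 division code into its three levels."""
    if len(code) != 6 or not code.isdigit():
        raise ValueError("expected a 6-digit numeric code")
    return {
        "province": code[:2],  # digits 1-2: province / autonomous region / municipality / SAR
        "city": code[2:4],     # digits 3-4: city / prefecture / autonomous prefecture / league
        "county": code[4:6],   # digits 5-6: county / urban district / county-level city / banner
    }

# e.g. split_division_code("110101") -> {"province": "11", "city": "01", "county": "01"}
```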
#### Code standards
* [Ministry of Civil Affairs of the PRC – Administrative division codes](https://www.mca.gov.cn/mzsj/xzqh/2022/202201xzqh.html)
* [National Bureau of Statistics of China – Division and urban-rural classification codes for statistical use](http://www.stats.gov.cn/sj/tjbz/tjyqhdmhcxhfdm/2022/index.html)
* [National Bureau of Statistics of China – Compilation rules for division and urban-rural classification codes for statistical use](http://www.stats.gov.cn/sj/tjbz/gjtjbz/202302/t20230213_1902741.html)
The codes for Hong Kong, Macau, and Taiwan are not standard codes; they were defined by reference to the standard coding rules so that users can work with a uniform scheme.
### Feedback
If any data is wrong or out of date, please let us know (leave a message under "Issues") so it can be updated as soon as possible.
|
sayan1101/subject_lines | ---
dataset_info:
config_name: sayan1101
features:
- name: prompt
dtype: string
- name: subject_lines
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 78493229
num_examples: 59489
- name: test
num_bytes: 4030472
num_examples: 3132
download_size: 10833382
dataset_size: 82523701
configs:
- config_name: sayan1101
data_files:
- split: train
path: sayan1101/train-*
- split: test
path: sayan1101/test-*
---
|
Excelsus/Annyv1 | ---
license: openrail
---
|
Gabriel1322/dona1 | ---
license: openrail
---
|
swaroopajit/next-dataset-refined-batch-6000 | ---
dataset_info:
features:
- name: caption
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 315307268.0
num_examples: 999
download_size: 288501432
dataset_size: 315307268.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "next-dataset-refined-batch-6000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ecoue/nordmann2023 | ---
annotations_creators: []
language:
- de
- en
language_creators: []
license:
- unknown
multilinguality:
- translation
pretty_name: nordmann2023
size_categories:
- 1M<n<10M
source_datasets: []
tags:
- europarl
- newscommentary
- wikititles
- ecb
- rapid
- eesc
- ema
- europat
- books
- ted2020
- qed
- eubookshop
task_categories:
- translation
task_ids: []
dataset_info:
features:
- name: translation
dtype:
translation:
languages:
- de
- en
config_name: balanced
splits:
- name: train
num_bytes: 1539472445
num_examples: 5656659
- name: validation
num_bytes: 706611
num_examples: 2754
- name: test
num_bytes: 411077
num_examples: 1831
download_size: 4076594396
dataset_size: 1540590133
---
|
weege007/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966694
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tyzhu/random_letter_same_length_find_passage_train50_eval20_rare | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 36128
num_examples: 120
- name: validation
num_bytes: 7296
num_examples: 20
download_size: 28032
dataset_size: 43424
---
# Dataset Card for "random_letter_same_length_find_passage_train50_eval20_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
azrai99/coursera-course-dataset | ---
license: apache-2.0
task_categories:
- text2text-generation
size_categories:
- n<1K
language:
- en
---
The initial purpose of this dataset extraction was to extract the relevant skills that can be obtained from each **Coursera** course.
The extracted skills can then be used for further analysis.
Feel free to use the dataset for your own use cases.
open-llm-leaderboard/details_CohereForAI__c4ai-command-r-v01 | ---
pretty_name: Evaluation run of CohereForAI/c4ai-command-r-v01
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CohereForAI/c4ai-command-r-v01](https://huggingface.co/CohereForAI/c4ai-command-r-v01)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CohereForAI__c4ai-command-r-v01\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-22T04:17:37.786335](https://huggingface.co/datasets/open-llm-leaderboard/details_CohereForAI__c4ai-command-r-v01/blob/main/results_2024-03-22T04-17-37.786335.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6814989868956801,\n\
\ \"acc_stderr\": 0.03111515781319065,\n \"acc_norm\": 0.6846891218527542,\n\
\ \"acc_norm_stderr\": 0.03172957499950617,\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.01683886288396583,\n \"mc2\": 0.5231991183271746,\n\
\ \"mc2_stderr\": 0.015493018963817363\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.64419795221843,\n \"acc_stderr\": 0.013990571137918763,\n\
\ \"acc_norm\": 0.6552901023890785,\n \"acc_norm_stderr\": 0.01388881628678211\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6742680740888269,\n\
\ \"acc_stderr\": 0.004676898861978905,\n \"acc_norm\": 0.8700458076080462,\n\
\ \"acc_norm_stderr\": 0.0033556582385714864\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7631578947368421,\n \"acc_stderr\": 0.03459777606810536,\n\
\ \"acc_norm\": 0.7631578947368421,\n \"acc_norm_stderr\": 0.03459777606810536\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7509433962264151,\n \"acc_stderr\": 0.026616482980501704,\n\
\ \"acc_norm\": 0.7509433962264151,\n \"acc_norm_stderr\": 0.026616482980501704\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.03196758697835363,\n\
\ \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.03196758697835363\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5614035087719298,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.5614035087719298,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4656084656084656,\n \"acc_stderr\": 0.025690321762493855,\n \"\
acc_norm\": 0.4656084656084656,\n \"acc_norm_stderr\": 0.025690321762493855\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n\
\ \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n\
\ \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n\
\ \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8181818181818182,\n \"acc_stderr\": 0.027479603010538787,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.027479603010538787\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313033,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313033\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7226890756302521,\n \"acc_stderr\": 0.029079374539480007,\n\
\ \"acc_norm\": 0.7226890756302521,\n \"acc_norm_stderr\": 0.029079374539480007\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590179,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590179\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252336,\n \"\
acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252336\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.869198312236287,\n \"acc_stderr\": 0.021948766059470767,\n \
\ \"acc_norm\": 0.869198312236287,\n \"acc_norm_stderr\": 0.021948766059470767\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n\
\ \"acc_stderr\": 0.02799153425851952,\n \"acc_norm\": 0.7757847533632287,\n\
\ \"acc_norm_stderr\": 0.02799153425851952\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476075,\n\
\ \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476075\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.859504132231405,\n \"acc_stderr\": 0.03172233426002158,\n \"acc_norm\"\
: 0.859504132231405,\n \"acc_norm_stderr\": 0.03172233426002158\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n\
\ \"acc_stderr\": 0.03434300243631,\n \"acc_norm\": 0.8518518518518519,\n\
\ \"acc_norm_stderr\": 0.03434300243631\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8404907975460123,\n \"acc_stderr\": 0.02876748172598386,\n\
\ \"acc_norm\": 0.8404907975460123,\n \"acc_norm_stderr\": 0.02876748172598386\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.04738975119274155,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.04738975119274155\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n\
\ \"acc_stderr\": 0.019119892798924985,\n \"acc_norm\": 0.905982905982906,\n\
\ \"acc_norm_stderr\": 0.019119892798924985\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.859514687100894,\n\
\ \"acc_stderr\": 0.012426211353093434,\n \"acc_norm\": 0.859514687100894,\n\
\ \"acc_norm_stderr\": 0.012426211353093434\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657761,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657761\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5720670391061452,\n\
\ \"acc_stderr\": 0.016547887997416112,\n \"acc_norm\": 0.5720670391061452,\n\
\ \"acc_norm_stderr\": 0.016547887997416112\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137904,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137904\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7491961414790996,\n\
\ \"acc_stderr\": 0.024619771956697168,\n \"acc_norm\": 0.7491961414790996,\n\
\ \"acc_norm_stderr\": 0.024619771956697168\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.02301670564026219,\n\
\ \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.02301670564026219\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5390070921985816,\n \"acc_stderr\": 0.02973659252642444,\n \
\ \"acc_norm\": 0.5390070921985816,\n \"acc_norm_stderr\": 0.02973659252642444\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5508474576271186,\n\
\ \"acc_stderr\": 0.012704030518851477,\n \"acc_norm\": 0.5508474576271186,\n\
\ \"acc_norm_stderr\": 0.012704030518851477\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146293,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146293\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7238562091503268,\n \"acc_stderr\": 0.018087276935663133,\n \
\ \"acc_norm\": 0.7238562091503268,\n \"acc_norm_stderr\": 0.018087276935663133\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.763265306122449,\n \"acc_stderr\": 0.02721283588407316,\n\
\ \"acc_norm\": 0.763265306122449,\n \"acc_norm_stderr\": 0.02721283588407316\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n\
\ \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n\
\ \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061452,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061452\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.01683886288396583,\n \"mc2\": 0.5231991183271746,\n\
\ \"mc2_stderr\": 0.015493018963817363\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8153117600631413,\n \"acc_stderr\": 0.010905978112156883\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5663381349507203,\n \
\ \"acc_stderr\": 0.013650728047064692\n }\n}\n```"
repo_url: https://huggingface.co/CohereForAI/c4ai-command-r-v01
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|arc:challenge|25_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|gsm8k|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hellaswag|10_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T04-17-37.786335.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T04-17-37.786335.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- '**/details_harness|winogrande|5_2024-03-22T04-17-37.786335.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T04-17-37.786335.parquet'
- config_name: results
data_files:
- split: 2024_03_22T04_17_37.786335
path:
- results_2024-03-22T04-17-37.786335.parquet
- split: latest
path:
- results_2024-03-22T04-17-37.786335.parquet
---
# Dataset Card for Evaluation run of CohereForAI/c4ai-command-r-v01
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CohereForAI/c4ai-command-r-v01](https://huggingface.co/CohereForAI/c4ai-command-r-v01) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CohereForAI__c4ai-command-r-v01",
"harness_winogrande_5",
split="train")
```
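Because each run is stored under a split named with the run's timestamp (with "latest" as an alias for the newest one), the most recent run can also be resolved by hand: the timestamp format sorts lexicographically in chronological order. A minimal sketch — the older split name below is purely illustrative:

```python
# Each evaluation run adds a split named with the run's timestamp;
# "latest" is an alias for the newest one. The format sorts
# lexicographically in chronological order, so max() is enough.
split_names = [
    "2024_03_21T18_02_11.000001",  # illustrative older run
    "2024_03_22T04_17_37.786335",  # the run shown in this card
    "latest",
]

# Ignore the "latest" alias and keep only timestamp-named splits.
timestamp_splits = [s for s in split_names if s != "latest"]
newest = max(timestamp_splits)
print(newest)  # 2024_03_22T04_17_37.786335
```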
## Latest results
These are the [latest results from run 2024-03-22T04:17:37.786335](https://huggingface.co/datasets/open-llm-leaderboard/details_CohereForAI__c4ai-command-r-v01/blob/main/results_2024-03-22T04-17-37.786335.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6814989868956801,
"acc_stderr": 0.03111515781319065,
"acc_norm": 0.6846891218527542,
"acc_norm_stderr": 0.03172957499950617,
"mc1": 0.3635250917992656,
"mc1_stderr": 0.01683886288396583,
"mc2": 0.5231991183271746,
"mc2_stderr": 0.015493018963817363
},
"harness|arc:challenge|25": {
"acc": 0.64419795221843,
"acc_stderr": 0.013990571137918763,
"acc_norm": 0.6552901023890785,
"acc_norm_stderr": 0.01388881628678211
},
"harness|hellaswag|10": {
"acc": 0.6742680740888269,
"acc_stderr": 0.004676898861978905,
"acc_norm": 0.8700458076080462,
"acc_norm_stderr": 0.0033556582385714864
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7631578947368421,
"acc_stderr": 0.03459777606810536,
"acc_norm": 0.7631578947368421,
"acc_norm_stderr": 0.03459777606810536
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7509433962264151,
"acc_stderr": 0.026616482980501704,
"acc_norm": 0.7509433962264151,
"acc_norm_stderr": 0.026616482980501704
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6042553191489362,
"acc_stderr": 0.03196758697835363,
"acc_norm": 0.6042553191489362,
"acc_norm_stderr": 0.03196758697835363
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5614035087719298,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.5614035087719298,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4656084656084656,
"acc_stderr": 0.025690321762493855,
"acc_norm": 0.4656084656084656,
"acc_norm_stderr": 0.025690321762493855
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.027479603010538787,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.027479603010538787
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313033,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313033
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7226890756302521,
"acc_stderr": 0.029079374539480007,
"acc_norm": 0.7226890756302521,
"acc_norm_stderr": 0.029079374539480007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590179,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590179
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.869198312236287,
"acc_stderr": 0.021948766059470767,
"acc_norm": 0.869198312236287,
"acc_norm_stderr": 0.021948766059470767
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7757847533632287,
"acc_stderr": 0.02799153425851952,
"acc_norm": 0.7757847533632287,
"acc_norm_stderr": 0.02799153425851952
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8244274809160306,
"acc_stderr": 0.03336820338476075,
"acc_norm": 0.8244274809160306,
"acc_norm_stderr": 0.03336820338476075
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.859504132231405,
"acc_stderr": 0.03172233426002158,
"acc_norm": 0.859504132231405,
"acc_norm_stderr": 0.03172233426002158
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243631,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243631
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8404907975460123,
"acc_stderr": 0.02876748172598386,
"acc_norm": 0.8404907975460123,
"acc_norm_stderr": 0.02876748172598386
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.04738975119274155,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.04738975119274155
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.905982905982906,
"acc_stderr": 0.019119892798924985,
"acc_norm": 0.905982905982906,
"acc_norm_stderr": 0.019119892798924985
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.859514687100894,
"acc_stderr": 0.012426211353093434,
"acc_norm": 0.859514687100894,
"acc_norm_stderr": 0.012426211353093434
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.02418242749657761,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.02418242749657761
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5720670391061452,
"acc_stderr": 0.016547887997416112,
"acc_norm": 0.5720670391061452,
"acc_norm_stderr": 0.016547887997416112
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137904,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137904
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7491961414790996,
"acc_stderr": 0.024619771956697168,
"acc_norm": 0.7491961414790996,
"acc_norm_stderr": 0.024619771956697168
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.02301670564026219,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.02301670564026219
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5390070921985816,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.5390070921985816,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5508474576271186,
"acc_stderr": 0.012704030518851477,
"acc_norm": 0.5508474576271186,
"acc_norm_stderr": 0.012704030518851477
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146293,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146293
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7238562091503268,
"acc_stderr": 0.018087276935663133,
"acc_norm": 0.7238562091503268,
"acc_norm_stderr": 0.018087276935663133
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.763265306122449,
"acc_stderr": 0.02721283588407316,
"acc_norm": 0.763265306122449,
"acc_norm_stderr": 0.02721283588407316
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061452,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061452
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3635250917992656,
"mc1_stderr": 0.01683886288396583,
"mc2": 0.5231991183271746,
"mc2_stderr": 0.015493018963817363
},
"harness|winogrande|5": {
"acc": 0.8153117600631413,
"acc_stderr": 0.010905978112156883
},
"harness|gsm8k|5": {
"acc": 0.5663381349507203,
"acc_stderr": 0.013650728047064692
}
}
```
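To pull a single metric out of a results file like the one above, load the JSON and index into the nested task keys. A sketch using a small fragment of the structure (values copied from the JSON above; the full file has one entry per task):

```python
import json

# A fragment of the results structure shown above.
results_json = """
{
  "all": {"acc": 0.6814989868956801, "acc_stderr": 0.03111515781319065},
  "harness|winogrande|5": {"acc": 0.8153117600631413}
}
"""

results = json.loads(results_json)
winogrande_acc = results["harness|winogrande|5"]["acc"]
print(f"winogrande (5-shot) acc: {winogrande_acc:.4f}")  # 0.8153
```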
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
MMStestlab/testdataset | ---
license: apache-2.0
---
|
huggingartists/post-malone | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/post-malone"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.572271 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/1010194fa644be099aa2d1329de0b230.448x448x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/post-malone">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Post Malone</div>
<a href="https://genius.com/artists/post-malone">
<div style="text-align: center; font-size: 14px;">@post-malone</div>
</a>
</div>
### Dataset Summary
A lyrics dataset parsed from Genius, designed for generating lyrics with HuggingArtists.
The model is available [here](https://huggingface.co/huggingartists/post-malone).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/post-malone")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train | validation | test |
|------:|-----------:|-----:|
|   295 |          - |    - |
The 'train' split can easily be divided into 'train', 'validation' and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np

datasets = load_dataset("huggingartists/post-malone")

train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03  # remainder after the first two cuts

texts = datasets["train"]["text"]
train, validation, test = np.split(
    texts,
    [
        int(len(texts) * train_percentage),
        int(len(texts) * (train_percentage + validation_percentage)),
    ],
)
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
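The index arithmetic in the snippet above can be sanity-checked on a toy list of the same size (295 texts, matching this dataset's train split) without downloading anything; the list contents below are illustrative:

```python
import numpy as np

# 295 placeholder texts, the size of this dataset's train split.
texts = np.array([f"song {i}" for i in range(295)])

# Cut at 90% and 97% of the data, as in the snippet above.
train, validation, test = np.split(
    texts, [int(len(texts) * 0.9), int(len(texts) * 0.97)]
)
print(len(train), len(validation), len(test))  # 265 21 9
```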
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author={Aleksey Korshuk},
    year={2021}
}
```
## About
*Built by Aleksey Korshuk*
[GitHub](https://github.com/AlekseyKorshuk) · [Twitter](https://twitter.com/intent/follow?screen_name=alekseykorshuk) · [Telegram](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the [project repository](https://github.com/AlekseyKorshuk/huggingartists).
|
sudarsa/tts_m_ds | ---
license: apache-2.0
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
- name: speaker_id
dtype: int64
splits:
- name: train
num_bytes: 101904153.0
num_examples: 37
download_size: 88655284
dataset_size: 101904153.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
HideOnBush/try_from_load_script | ---
license: apache-2.0
dataset_info:
features:
- name: sample_id
dtype: string
- name: image
dtype: image
- name: ocr_bboxes
sequence:
sequence: float64
- name: ocr_polygons
sequence:
sequence: float64
- name: ocr_predictions
sequence: string
splits:
- name: train
num_bytes: 3704492.0
num_examples: 20
- name: val
num_bytes: 3847574.0
num_examples: 20
- name: concat
num_bytes: 3784892.0
num_examples: 20
download_size: 11071052
dataset_size: 11336958.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: concat
path: data/concat-*
---
|
mrSoul7766/hindi_to_odia_translation_10k | ---
language:
- hi
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 8648292
num_examples: 10951
download_size: 2080403
dataset_size: 8648292
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tobydrew/Omg | ---
license: openrail
---
|
hung577/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245921
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
rodrigomokipo/bowie2 | ---
license: openrail
---
|
open-llm-leaderboard/details_l3utterfly__phi-2-layla-v1-chatml | ---
pretty_name: Evaluation run of l3utterfly/phi-2-layla-v1-chatml
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [l3utterfly/phi-2-layla-v1-chatml](https://huggingface.co/l3utterfly/phi-2-layla-v1-chatml)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_l3utterfly__phi-2-layla-v1-chatml\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-11T09:16:31.575195](https://huggingface.co/datasets/open-llm-leaderboard/details_l3utterfly__phi-2-layla-v1-chatml/blob/main/results_2024-03-11T09-16-31.575195.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5689793192343258,\n\
\ \"acc_stderr\": 0.03378530184123675,\n \"acc_norm\": 0.5699249209739704,\n\
\ \"acc_norm_stderr\": 0.03447730573554182,\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.016238065069059605,\n \"mc2\": 0.4421012271211113,\n\
\ \"mc2_stderr\": 0.015139955430833385\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5784982935153583,\n \"acc_stderr\": 0.014430197069326023,\n\
\ \"acc_norm\": 0.6040955631399317,\n \"acc_norm_stderr\": 0.01429122839353659\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5657239593706433,\n\
\ \"acc_stderr\": 0.0049464854665446254,\n \"acc_norm\": 0.7457677753435571,\n\
\ \"acc_norm_stderr\": 0.00434538861452002\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.04026097083296564,\n\
\ \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.04026097083296564\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n\
\ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.03265019475033582,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.03265019475033582\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474894,\n \"\
acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.025525034382474894\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.667741935483871,\n\
\ \"acc_stderr\": 0.026795560848122794,\n \"acc_norm\": 0.667741935483871,\n\
\ \"acc_norm_stderr\": 0.026795560848122794\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481912,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481912\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.03191178226713547,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03191178226713547\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7512953367875648,\n \"acc_stderr\": 0.031195840877700293,\n\
\ \"acc_norm\": 0.7512953367875648,\n \"acc_norm_stderr\": 0.031195840877700293\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5615384615384615,\n \"acc_stderr\": 0.025158266016868575,\n\
\ \"acc_norm\": 0.5615384615384615,\n \"acc_norm_stderr\": 0.025158266016868575\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524586,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524586\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.03175367846096625,\n \
\ \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.03175367846096625\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7853211009174312,\n \"acc_stderr\": 0.017604304149256483,\n \"\
acc_norm\": 0.7853211009174312,\n \"acc_norm_stderr\": 0.017604304149256483\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502325,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502325\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.033086111132364364,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.033086111132364364\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7257383966244726,\n \"acc_stderr\": 0.02904133351059804,\n \
\ \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.02904133351059804\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n\
\ \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912046,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912046\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n\
\ \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n\
\ \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6960408684546615,\n\
\ \"acc_stderr\": 0.016448321686769046,\n \"acc_norm\": 0.6960408684546615,\n\
\ \"acc_norm_stderr\": 0.016448321686769046\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.025416003773165538,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.025416003773165538\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23687150837988827,\n\
\ \"acc_stderr\": 0.014219570788103986,\n \"acc_norm\": 0.23687150837988827,\n\
\ \"acc_norm_stderr\": 0.014219570788103986\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.027996723180631452,\n\
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.027996723180631452\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\
\ \"acc_stderr\": 0.027316847674192707,\n \"acc_norm\": 0.6366559485530546,\n\
\ \"acc_norm_stderr\": 0.027316847674192707\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6141975308641975,\n \"acc_stderr\": 0.02708540122613214,\n\
\ \"acc_norm\": 0.6141975308641975,\n \"acc_norm_stderr\": 0.02708540122613214\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4148936170212766,\n \"acc_stderr\": 0.029392236584612503,\n \
\ \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.029392236584612503\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4256844850065189,\n\
\ \"acc_stderr\": 0.012628393551811943,\n \"acc_norm\": 0.4256844850065189,\n\
\ \"acc_norm_stderr\": 0.012628393551811943\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4742647058823529,\n \"acc_stderr\": 0.030332578094555033,\n\
\ \"acc_norm\": 0.4742647058823529,\n \"acc_norm_stderr\": 0.030332578094555033\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.545751633986928,\n \"acc_stderr\": 0.02014297455379521,\n \
\ \"acc_norm\": 0.545751633986928,\n \"acc_norm_stderr\": 0.02014297455379521\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n\
\ \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n\
\ \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6900584795321637,\n \"acc_stderr\": 0.035469769593931624,\n\
\ \"acc_norm\": 0.6900584795321637,\n \"acc_norm_stderr\": 0.035469769593931624\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.016238065069059605,\n \"mc2\": 0.4421012271211113,\n\
\ \"mc2_stderr\": 0.015139955430833385\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7426992896606156,\n \"acc_stderr\": 0.0122859896188657\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5451099317664898,\n \
\ \"acc_stderr\": 0.013716318771794602\n }\n}\n```"
repo_url: https://huggingface.co/l3utterfly/phi-2-layla-v1-chatml
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|arc:challenge|25_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|gsm8k|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hellaswag|10_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T09-16-31.575195.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T09-16-31.575195.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- '**/details_harness|winogrande|5_2024-03-11T09-16-31.575195.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-11T09-16-31.575195.parquet'
- config_name: results
data_files:
- split: 2024_03_11T09_16_31.575195
path:
- results_2024-03-11T09-16-31.575195.parquet
- split: latest
path:
- results_2024-03-11T09-16-31.575195.parquet
---
# Dataset Card for Evaluation run of l3utterfly/phi-2-layla-v1-chatml
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [l3utterfly/phi-2-layla-v1-chatml](https://huggingface.co/l3utterfly/phi-2-layla-v1-chatml) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_l3utterfly__phi-2-layla-v1-chatml",
"harness_winogrande_5",
split="train")
```
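Configuration names follow the pattern `harness_<task>_<num_fewshot>` (e.g. `harness_hendrycksTest_college_biology_5`, `harness_truthfulqa_mc_0`), so they can be built programmatically. A minimal sketch; the helper name is ours, not part of the `datasets` API:

```python
def details_config_name(task: str, num_fewshot: int) -> str:
    # Build a configuration name matching the naming pattern used by this
    # dataset's configs, e.g. "harness_hendrycksTest_college_biology_5".
    # Hypothetical helper, shown for illustration only.
    return f"harness_{task}_{num_fewshot}"

# For example, to target the 5-shot college biology details:
print(details_config_name("hendrycksTest_college_biology", 5))
# → harness_hendrycksTest_college_biology_5
```

The resulting string can be passed as the second argument to `load_dataset`, with `split="latest"` to get the most recent run or a timestamp-named split (e.g. `2024_03_11T09_16_31.575195`) for a specific one.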
## Latest results
These are the [latest results from run 2024-03-11T09:16:31.575195](https://huggingface.co/datasets/open-llm-leaderboard/details_l3utterfly__phi-2-layla-v1-chatml/blob/main/results_2024-03-11T09-16-31.575195.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5689793192343258,
"acc_stderr": 0.03378530184123675,
"acc_norm": 0.5699249209739704,
"acc_norm_stderr": 0.03447730573554182,
"mc1": 0.31334149326805383,
"mc1_stderr": 0.016238065069059605,
"mc2": 0.4421012271211113,
"mc2_stderr": 0.015139955430833385
},
"harness|arc:challenge|25": {
"acc": 0.5784982935153583,
"acc_stderr": 0.014430197069326023,
"acc_norm": 0.6040955631399317,
"acc_norm_stderr": 0.01429122839353659
},
"harness|hellaswag|10": {
"acc": 0.5657239593706433,
"acc_stderr": 0.0049464854665446254,
"acc_norm": 0.7457677753435571,
"acc_norm_stderr": 0.00434538861452002
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5723684210526315,
"acc_stderr": 0.04026097083296564,
"acc_norm": 0.5723684210526315,
"acc_norm_stderr": 0.04026097083296564
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.625,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939392,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939392
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.025525034382474894,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.025525034382474894
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.667741935483871,
"acc_stderr": 0.026795560848122794,
"acc_norm": 0.667741935483871,
"acc_norm_stderr": 0.026795560848122794
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.03713158067481912,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.03713158067481912
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03191178226713547,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03191178226713547
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7512953367875648,
"acc_stderr": 0.031195840877700293,
"acc_norm": 0.7512953367875648,
"acc_norm_stderr": 0.031195840877700293
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5615384615384615,
"acc_stderr": 0.025158266016868575,
"acc_norm": 0.5615384615384615,
"acc_norm_stderr": 0.025158266016868575
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524586,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524586
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6050420168067226,
"acc_stderr": 0.03175367846096625,
"acc_norm": 0.6050420168067226,
"acc_norm_stderr": 0.03175367846096625
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7853211009174312,
"acc_stderr": 0.017604304149256483,
"acc_norm": 0.7853211009174312,
"acc_norm_stderr": 0.017604304149256483
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.033086111132364364,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.033086111132364364
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.02904133351059804,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.02904133351059804
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.032190792004199956,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.032190792004199956
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912046,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912046
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.02416161812798774,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.02416161812798774
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6960408684546615,
"acc_stderr": 0.016448321686769046,
"acc_norm": 0.6960408684546615,
"acc_norm_stderr": 0.016448321686769046
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.025416003773165538,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.025416003773165538
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23687150837988827,
"acc_stderr": 0.014219570788103986,
"acc_norm": 0.23687150837988827,
"acc_norm_stderr": 0.014219570788103986
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.027996723180631452,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.027996723180631452
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6366559485530546,
"acc_stderr": 0.027316847674192707,
"acc_norm": 0.6366559485530546,
"acc_norm_stderr": 0.027316847674192707
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6141975308641975,
"acc_stderr": 0.02708540122613214,
"acc_norm": 0.6141975308641975,
"acc_norm_stderr": 0.02708540122613214
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.029392236584612503,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.029392236584612503
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4256844850065189,
"acc_stderr": 0.012628393551811943,
"acc_norm": 0.4256844850065189,
"acc_norm_stderr": 0.012628393551811943
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4742647058823529,
"acc_stderr": 0.030332578094555033,
"acc_norm": 0.4742647058823529,
"acc_norm_stderr": 0.030332578094555033
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.545751633986928,
"acc_stderr": 0.02014297455379521,
"acc_norm": 0.545751633986928,
"acc_norm_stderr": 0.02014297455379521
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.0293936093198798,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.0293936093198798
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6900584795321637,
"acc_stderr": 0.035469769593931624,
"acc_norm": 0.6900584795321637,
"acc_norm_stderr": 0.035469769593931624
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31334149326805383,
"mc1_stderr": 0.016238065069059605,
"mc2": 0.4421012271211113,
"mc2_stderr": 0.015139955430833385
},
"harness|winogrande|5": {
"acc": 0.7426992896606156,
"acc_stderr": 0.0122859896188657
},
"harness|gsm8k|5": {
"acc": 0.5451099317664898,
"acc_stderr": 0.013716318771794602
}
}
```
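The MMLU score reported on the leaderboard is the plain average of `acc` over the `hendrycksTest` entries above. A minimal sketch of that aggregation, using a small illustrative subset of the values shown (not the full 57 subjects):

```python
# Average the "acc" metric over MMLU (hendrycksTest) tasks, mirroring how the
# leaderboard aggregates the per-subject scores. The dict below is a small
# subset copied from the results above, for illustration only.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.21},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.45185185185185184},
    "harness|arc:challenge|25": {"acc": 0.5784982935153583},  # not MMLU, skipped
}

mmlu_scores = [
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(round(mmlu_avg, 4))
# → 0.3309
```

The same filtering-by-prefix approach extends to the full results JSON, since every MMLU entry shares the `harness|hendrycksTest-` key prefix.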
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_NousResearch__Nous-Capybara-7B | ---
pretty_name: Evaluation run of NousResearch/Nous-Capybara-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NousResearch/Nous-Capybara-7B](https://huggingface.co/NousResearch/Nous-Capybara-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NousResearch__Nous-Capybara-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-26T10:04:06.657225](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__Nous-Capybara-7B/blob/main/results_2023-10-26T10-04-06.657225.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.27485318791946306,\n\
\ \"em_stderr\": 0.004571966205905316,\n \"f1\": 0.33436241610738393,\n\
\ \"f1_stderr\": 0.004533650800277715,\n \"acc\": 0.40148895416572666,\n\
\ \"acc_stderr\": 0.009723217836579088\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.27485318791946306,\n \"em_stderr\": 0.004571966205905316,\n\
\ \"f1\": 0.33436241610738393,\n \"f1_stderr\": 0.004533650800277715\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06974981046247157,\n \
\ \"acc_stderr\": 0.007016389571013846\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7332280978689818,\n \"acc_stderr\": 0.01243004610214433\n\
\ }\n}\n```"
repo_url: https://huggingface.co/NousResearch/Nous-Capybara-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_26T10_04_06.657225
path:
- '**/details_harness|drop|3_2023-10-26T10-04-06.657225.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-26T10-04-06.657225.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_26T10_04_06.657225
path:
- '**/details_harness|gsm8k|5_2023-10-26T10-04-06.657225.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-26T10-04-06.657225.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-27-10.043918.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-27-10.043918.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-27-10.043918.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_26T10_04_06.657225
path:
- '**/details_harness|winogrande|5_2023-10-26T10-04-06.657225.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-26T10-04-06.657225.parquet'
- config_name: results
data_files:
- split: 2023_10_03T19_27_10.043918
path:
- results_2023-10-03T19-27-10.043918.parquet
- split: 2023_10_26T10_04_06.657225
path:
- results_2023-10-26T10-04-06.657225.parquet
- split: latest
path:
- results_2023-10-26T10-04-06.657225.parquet
---
# Dataset Card for Evaluation run of NousResearch/Nous-Capybara-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NousResearch/Nous-Capybara-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NousResearch/Nous-Capybara-7B](https://huggingface.co/NousResearch/Nous-Capybara-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NousResearch__Nous-Capybara-7B",
"harness_winogrande_5",
split="train")
```
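Each per-run split is named after the run's timestamp (for example `2023_10_26T10_04_06.657225`). As a small illustration, the name can be parsed back into a `datetime` to pick the most recent run programmatically; the helper below is a hypothetical convenience, not part of the `datasets` API:

```python
from datetime import datetime

# Split names encode the run timestamp, e.g. "2023_10_26T10_04_06.657225".
# Hypothetical helper (not part of the datasets API) to parse one back:
def split_to_datetime(split_name: str) -> datetime:
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

# Picking the newest run among this dataset's timestamped splits:
splits = ["2023_10_03T19_27_10.043918", "2023_10_26T10_04_06.657225"]
latest = max(splits, key=split_to_datetime)
print(latest)  # 2023_10_26T10_04_06.657225
```

In practice the precomputed "latest" split makes this unnecessary, but the parsing is handy when comparing results across several timestamped runs.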
## Latest results
These are the [latest results from run 2023-10-26T10:04:06.657225](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__Nous-Capybara-7B/blob/main/results_2023-10-26T10-04-06.657225.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.27485318791946306,
"em_stderr": 0.004571966205905316,
"f1": 0.33436241610738393,
"f1_stderr": 0.004533650800277715,
"acc": 0.40148895416572666,
"acc_stderr": 0.009723217836579088
},
"harness|drop|3": {
"em": 0.27485318791946306,
"em_stderr": 0.004571966205905316,
"f1": 0.33436241610738393,
"f1_stderr": 0.004533650800277715
},
"harness|gsm8k|5": {
"acc": 0.06974981046247157,
"acc_stderr": 0.007016389571013846
},
"harness|winogrande|5": {
"acc": 0.7332280978689818,
"acc_stderr": 0.01243004610214433
}
}
```
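The top-level "all" entry appears to be an unweighted mean of each metric over the tasks that report it; the sketch below (an assumption, not a documented guarantee of the harness) recomputes the aggregate `acc` from the two acc-reporting tasks above:

```python
import math

# Per-task accuracies copied verbatim from the latest-results JSON above
task_acc = {
    "harness|gsm8k|5": 0.06974981046247157,
    "harness|winogrande|5": 0.7332280978689818,
}

# "all" -> "acc" appears to be the plain (unweighted) mean over these tasks
mean_acc = sum(task_acc.values()) / len(task_acc)
print(mean_acc)  # ≈ 0.40148895416572666, matching the "all" entry
```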
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
thisisHJLee/1cycle_data_1562 | ---
license: apache-2.0
---
|
jonathanji/gv_dataset_raw | ---
license: openrail
---
|
Tesutato/medical_multiple_choice_qa | ---
configs:
- config_name: main_data
data_files: "dataset.csv"
- config_name: raw_data
data_files: "raw_dataset.csv"
--- |
Nykiz/pixel-images-128x128 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 327075.0
num_examples: 27
download_size: 328265
dataset_size: 327075.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
amirshahisage/intacct-enablement-qna | ---
license: apache-2.0
language:
- en
pretty_name: Sage Intacct Knowledgebase Articles
size_categories:
- 1K<n<10K
--- |
zolak/twitter_dataset_80_1713161549 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 474322
num_examples: 1180
download_size: 237490
dataset_size: 474322
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Syed-Hasan-8503/distilabel-capybara-dpo-7k-binarized-rearranged | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: chosen_response
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected_response
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 104125641
num_examples: 7563
download_size: 53642042
dataset_size: 104125641
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
LahiruLowe/t0_explanation_targets_h2ogpt-gm-oasst1-en-2048-falcon-40b-v2-GGML | ---
dataset_info:
features:
- name: original_index
dtype: int64
- name: inputs
dtype: string
- name: targets
dtype: string
- name: task_source
dtype: string
- name: task_name
dtype: string
- name: template_type
dtype: string
- name: system_message
dtype: string
- name: explained_targets
dtype: string
splits:
- name: train
num_bytes: 1045060
num_examples: 579
download_size: 0
dataset_size: 1045060
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "t0_explanation_targets_h2ogpt-gm-oasst1-en-2048-falcon-40b-v2-GGML"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lighteval/disinformation_wedging | ---
dataset_info:
features:
- name: text
dtype: string
- name: references
sequence: 'null'
- name: none
dtype: string
splits:
- name: validation
num_bytes: 7406
num_examples: 11
download_size: 9583
dataset_size: 7406
---
# Dataset Card for "disinformation_wedging"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_wnli_past_for_past_participle | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1534
num_examples: 7
- name: test
num_bytes: 4626
num_examples: 15
- name: train
num_bytes: 11435
num_examples: 50
download_size: 14929
dataset_size: 17595
---
# Dataset Card for "MULTI_VALUE_wnli_past_for_past_participle"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shidowake/augmxnt_ultra-orca-boros-en-ja-v1_split_1 | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: weight
dtype: float64
- name: source
dtype: string
splits:
- name: train
num_bytes: 20642196.37881698
num_examples: 9398
download_size: 10693213
dataset_size: 20642196.37881698
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ahishamm/isic_binary__enhanced_balanced | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': benign
'1': malignant
splits:
- name: train
num_bytes: 163065541.52
num_examples: 2940
- name: test
num_bytes: 31595914.0
num_examples: 720
download_size: 172630596
dataset_size: 194661455.52
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Lorotanida2/guii | ---
license: openrail
---
|
open-llm-leaderboard/details_DopeorNope__COKAL-v1-70B | ---
pretty_name: Evaluation run of DopeorNope/COKAL-v1-70B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DopeorNope/COKAL-v1-70B](https://huggingface.co/DopeorNope/COKAL-v1-70B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DopeorNope__COKAL-v1-70B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-10T10:21:56.669760](https://huggingface.co/datasets/open-llm-leaderboard/details_DopeorNope__COKAL-v1-70B/blob/main/results_2023-12-10T10-21-56.669760.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6806080675011864,\n\
\ \"acc_stderr\": 0.031026141939535783,\n \"acc_norm\": 0.6871684287339627,\n\
\ \"acc_norm_stderr\": 0.03163298834751675,\n \"mc1\": 0.609547123623011,\n\
\ \"mc1_stderr\": 0.017078230743431448,\n \"mc2\": 0.7279131434968619,\n\
\ \"mc2_stderr\": 0.012814436118254086\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.8583617747440273,\n \"acc_stderr\": 0.010189361609566652,\n\
\ \"acc_norm\": 0.8745733788395904,\n \"acc_norm_stderr\": 0.009678644555462999\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6278629755028878,\n\
\ \"acc_stderr\": 0.004823867761332464,\n \"acc_norm\": 0.8329018123879706,\n\
\ \"acc_norm_stderr\": 0.0037230107458783956\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.5481481481481482,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7631578947368421,\n \"acc_stderr\": 0.034597776068105365,\n\
\ \"acc_norm\": 0.7631578947368421,\n \"acc_norm_stderr\": 0.034597776068105365\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n\
\ \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n\
\ \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n\
\ \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6127659574468085,\n \"acc_stderr\": 0.03184389265339526,\n\
\ \"acc_norm\": 0.6127659574468085,\n \"acc_norm_stderr\": 0.03184389265339526\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4417989417989418,\n \"acc_stderr\": 0.025576257061253837,\n \"\
acc_norm\": 0.4417989417989418,\n \"acc_norm_stderr\": 0.025576257061253837\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.02275520495954294,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.02275520495954294\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\"\
: 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8787878787878788,\n \"acc_stderr\": 0.02548549837334323,\n\
\ \"acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.02548549837334323\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7076923076923077,\n \"acc_stderr\": 0.023060438380857733,\n\
\ \"acc_norm\": 0.7076923076923077,\n \"acc_norm_stderr\": 0.023060438380857733\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.027553614467863804,\n\
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.027553614467863804\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4304635761589404,\n \"acc_stderr\": 0.04042809961395634,\n \"\
acc_norm\": 0.4304635761589404,\n \"acc_norm_stderr\": 0.04042809961395634\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8917431192660551,\n \"acc_stderr\": 0.01332134844761175,\n \"\
acc_norm\": 0.8917431192660551,\n \"acc_norm_stderr\": 0.01332134844761175\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6342592592592593,\n \"acc_stderr\": 0.032847388576472056,\n \"\
acc_norm\": 0.6342592592592593,\n \"acc_norm_stderr\": 0.032847388576472056\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316935,\n \"\
acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316935\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8945147679324894,\n \"acc_stderr\": 0.019995560723758538,\n \
\ \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.019995560723758538\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7354260089686099,\n\
\ \"acc_stderr\": 0.029605103217038325,\n \"acc_norm\": 0.7354260089686099,\n\
\ \"acc_norm_stderr\": 0.029605103217038325\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594626,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594626\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8677685950413223,\n \"acc_stderr\": 0.030922788320445795,\n \"\
acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.030922788320445795\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8365261813537676,\n\
\ \"acc_stderr\": 0.013223928616741617,\n \"acc_norm\": 0.8365261813537676,\n\
\ \"acc_norm_stderr\": 0.013223928616741617\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6078212290502794,\n\
\ \"acc_stderr\": 0.016329061073207453,\n \"acc_norm\": 0.6078212290502794,\n\
\ \"acc_norm_stderr\": 0.016329061073207453\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7620578778135049,\n\
\ \"acc_stderr\": 0.02418515064781871,\n \"acc_norm\": 0.7620578778135049,\n\
\ \"acc_norm_stderr\": 0.02418515064781871\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7901234567901234,\n \"acc_stderr\": 0.02265834408598137,\n\
\ \"acc_norm\": 0.7901234567901234,\n \"acc_norm_stderr\": 0.02265834408598137\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5283687943262412,\n \"acc_stderr\": 0.029779450957303055,\n \
\ \"acc_norm\": 0.5283687943262412,\n \"acc_norm_stderr\": 0.029779450957303055\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6114732724902217,\n\
\ \"acc_stderr\": 0.012448817838292376,\n \"acc_norm\": 0.6114732724902217,\n\
\ \"acc_norm_stderr\": 0.012448817838292376\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7573529411764706,\n \"acc_stderr\": 0.026040662474201264,\n\
\ \"acc_norm\": 0.7573529411764706,\n \"acc_norm_stderr\": 0.026040662474201264\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7467320261437909,\n \"acc_stderr\": 0.01759348689536683,\n \
\ \"acc_norm\": 0.7467320261437909,\n \"acc_norm_stderr\": 0.01759348689536683\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352202,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352202\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.02709729011807082,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.02709729011807082\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.609547123623011,\n\
\ \"mc1_stderr\": 0.017078230743431448,\n \"mc2\": 0.7279131434968619,\n\
\ \"mc2_stderr\": 0.012814436118254086\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8026835043409629,\n \"acc_stderr\": 0.011185026389050374\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.39272175890826383,\n \
\ \"acc_stderr\": 0.013451745349586566\n }\n}\n```"
repo_url: https://huggingface.co/DopeorNope/COKAL-v1-70B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|arc:challenge|25_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|gsm8k|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hellaswag|10_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T10-21-56.669760.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T10-21-56.669760.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- '**/details_harness|winogrande|5_2023-12-10T10-21-56.669760.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-10T10-21-56.669760.parquet'
- config_name: results
data_files:
- split: 2023_12_10T10_21_56.669760
path:
- results_2023-12-10T10-21-56.669760.parquet
- split: latest
path:
- results_2023-12-10T10-21-56.669760.parquet
---
# Dataset Card for Evaluation run of DopeorNope/COKAL-v1-70B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/DopeorNope/COKAL-v1-70B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [DopeorNope/COKAL-v1-70B](https://huggingface.co/DopeorNope/COKAL-v1-70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DopeorNope__COKAL-v1-70B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-10T10:21:56.669760](https://huggingface.co/datasets/open-llm-leaderboard/details_DopeorNope__COKAL-v1-70B/blob/main/results_2023-12-10T10-21-56.669760.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6806080675011864,
"acc_stderr": 0.031026141939535783,
"acc_norm": 0.6871684287339627,
"acc_norm_stderr": 0.03163298834751675,
"mc1": 0.609547123623011,
"mc1_stderr": 0.017078230743431448,
"mc2": 0.7279131434968619,
"mc2_stderr": 0.012814436118254086
},
"harness|arc:challenge|25": {
"acc": 0.8583617747440273,
"acc_stderr": 0.010189361609566652,
"acc_norm": 0.8745733788395904,
"acc_norm_stderr": 0.009678644555462999
},
"harness|hellaswag|10": {
"acc": 0.6278629755028878,
"acc_stderr": 0.004823867761332464,
"acc_norm": 0.8329018123879706,
"acc_norm_stderr": 0.0037230107458783956
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5481481481481482,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.5481481481481482,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7631578947368421,
"acc_stderr": 0.034597776068105365,
"acc_norm": 0.7631578947368421,
"acc_norm_stderr": 0.034597776068105365
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8125,
"acc_stderr": 0.032639560491693344,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.032639560491693344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6127659574468085,
"acc_stderr": 0.03184389265339526,
"acc_norm": 0.6127659574468085,
"acc_norm_stderr": 0.03184389265339526
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4417989417989418,
"acc_stderr": 0.025576257061253837,
"acc_norm": 0.4417989417989418,
"acc_norm_stderr": 0.025576257061253837
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8,
"acc_stderr": 0.02275520495954294,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02275520495954294
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.02548549837334323,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.02548549837334323
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7076923076923077,
"acc_stderr": 0.023060438380857733,
"acc_norm": 0.7076923076923077,
"acc_norm_stderr": 0.023060438380857733
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.027553614467863804,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.027553614467863804
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4304635761589404,
"acc_stderr": 0.04042809961395634,
"acc_norm": 0.4304635761589404,
"acc_norm_stderr": 0.04042809961395634
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8917431192660551,
"acc_stderr": 0.01332134844761175,
"acc_norm": 0.8917431192660551,
"acc_norm_stderr": 0.01332134844761175
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6342592592592593,
"acc_stderr": 0.032847388576472056,
"acc_norm": 0.6342592592592593,
"acc_norm_stderr": 0.032847388576472056
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316935,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316935
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8945147679324894,
"acc_stderr": 0.019995560723758538,
"acc_norm": 0.8945147679324894,
"acc_norm_stderr": 0.019995560723758538
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7354260089686099,
"acc_stderr": 0.029605103217038325,
"acc_norm": 0.7354260089686099,
"acc_norm_stderr": 0.029605103217038325
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.035477710041594626,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.035477710041594626
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.030922788320445795,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.030922788320445795
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8365261813537676,
"acc_stderr": 0.013223928616741617,
"acc_norm": 0.8365261813537676,
"acc_norm_stderr": 0.013223928616741617
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6078212290502794,
"acc_stderr": 0.016329061073207453,
"acc_norm": 0.6078212290502794,
"acc_norm_stderr": 0.016329061073207453
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7620578778135049,
"acc_stderr": 0.02418515064781871,
"acc_norm": 0.7620578778135049,
"acc_norm_stderr": 0.02418515064781871
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7901234567901234,
"acc_stderr": 0.02265834408598137,
"acc_norm": 0.7901234567901234,
"acc_norm_stderr": 0.02265834408598137
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5283687943262412,
"acc_stderr": 0.029779450957303055,
"acc_norm": 0.5283687943262412,
"acc_norm_stderr": 0.029779450957303055
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6114732724902217,
"acc_stderr": 0.012448817838292376,
"acc_norm": 0.6114732724902217,
"acc_norm_stderr": 0.012448817838292376
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7573529411764706,
"acc_stderr": 0.026040662474201264,
"acc_norm": 0.7573529411764706,
"acc_norm_stderr": 0.026040662474201264
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7467320261437909,
"acc_stderr": 0.01759348689536683,
"acc_norm": 0.7467320261437909,
"acc_norm_stderr": 0.01759348689536683
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352202,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352202
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.02709729011807082,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.02709729011807082
},
"harness|truthfulqa:mc|0": {
"mc1": 0.609547123623011,
"mc1_stderr": 0.017078230743431448,
"mc2": 0.7279131434968619,
"mc2_stderr": 0.012814436118254086
},
"harness|winogrande|5": {
"acc": 0.8026835043409629,
"acc_stderr": 0.011185026389050374
},
"harness|gsm8k|5": {
"acc": 0.39272175890826383,
"acc_stderr": 0.013451745349586566
}
}
```
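As a quick sanity check, the per-task accuracies above can be averaged in the style of the leaderboard's aggregated MMLU score. A minimal sketch, using a handful of values copied from the results above (the task selection is illustrative, not the full sub-task set):

```python
# Average a few per-task accuracies copied from the results JSON above.
# This mirrors, in spirit, how the leaderboard aggregates MMLU sub-task
# scores; a real aggregation would span all hendrycksTest sub-tasks.
accs = {
    "high_school_us_history": 0.9117647058823529,
    "human_aging": 0.7354260089686099,
    "machine_learning": 0.5267857142857143,
    "virology": 0.5180722891566265,
    "world_religions": 0.8538011695906432,
}

mean_acc = sum(accs.values()) / len(accs)
print(f"mean acc over {len(accs)} tasks: {mean_acc:.4f}")
```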
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
carnival13/rbrt_eval_sur | ---
dataset_info:
features:
- name: domain_label
dtype: int64
- name: pass_label
dtype: int64
- name: input
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 13846819
num_examples: 6970
download_size: 3906257
dataset_size: 13846819
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "rbrt_eval_sur"
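Since the card is otherwise empty, a minimal loading sketch may help. The `load_eval_split` helper and its lazy import are illustrative assumptions, not part of the dataset; the feature names come from the `dataset_info` metadata above:

```python
# Feature names taken from the dataset_info metadata above.
EXPECTED_FEATURES = ["domain_label", "pass_label", "input", "input_ids", "attention_mask"]

def load_eval_split(split: str = "train"):
    """Load the pre-tokenized evaluation split from the Hub (requires `datasets`)."""
    from datasets import load_dataset  # imported lazily so the schema above is usable offline
    ds = load_dataset("carnival13/rbrt_eval_sur", split=split)
    missing = set(EXPECTED_FEATURES) - set(ds.column_names)
    assert not missing, f"unexpected schema, missing columns: {missing}"
    return ds
```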
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/elbe_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of elbe/エルベ/易北 (Azur Lane)
This is the dataset of elbe/エルベ/易北 (Azur Lane), containing 218 images and their tags.
The core tags of this character are `long_hair, breasts, purple_eyes, multicolored_hair, large_breasts, streaked_hair, white_hair, black_hair, fang, skin_fang, hat, black_headwear, peaked_cap, very_long_hair, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 218 | 421.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elbe_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 218 | 205.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elbe_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 592 | 479.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elbe_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 218 | 358.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elbe_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 592 | 754.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elbe_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/elbe_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, black_dress, black_gloves, cleavage, frilled_dress, layered_dress, looking_at_viewer, official_alternate_costume, open_mouth, solo, iron_cross, underboob_cutout, blush, puffy_short_sleeves, simple_background, two-tone_hair, upper_body, white_background |
| 1 | 14 |  |  |  |  |  | 1girl, beer_mug, black_dress, black_gloves, frilled_dress, iron_cross, looking_at_viewer, official_alternate_costume, cleavage, layered_dress, solo, open_mouth, grey_hair, holding_tray, underboob_cutout, black_footwear, puffy_short_sleeves, black_socks, blush, two-tone_hair |
| 2 | 39 |  |  |  |  |  | 1girl, solo, bare_shoulders, black_gloves, frilled_dress, looking_at_viewer, layered_dress, shoulder_cutout, iron_cross, black_dress, simple_background, black_corset, open_mouth, white_leotard, white_background, smile, black_thighhighs, blush, black_ribbon |
| 3 | 33 |  |  |  |  |  | 1girl, cleavage, solo, brown_sweater, looking_at_viewer, sunglasses, eyewear_on_head, twintails, bare_shoulders, garter_straps, two-tone_hair, blush, black_skirt, hair_ribbon, necklace, bangs, long_sleeves, black_bra, black_ribbon, off-shoulder_sweater, official_alternate_costume, sitting, black_thighhighs, collarbone, hair_between_eyes, handbag, open_mouth, smile |
| 4 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, twintails, black_bikini, blush, outdoors, solo, thighs, bangs, hair_bow, beach, cleavage, collarbone, jewelry, navel, open_mouth, smile, wet, bare_shoulders, black_thighhighs, side-tie_bikini_bottom, water, day, ocean, sand, sky, black_ribbon, cloud, grey_hair, hair_ribbon |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_dress | black_gloves | cleavage | frilled_dress | layered_dress | looking_at_viewer | official_alternate_costume | open_mouth | solo | iron_cross | underboob_cutout | blush | puffy_short_sleeves | simple_background | two-tone_hair | upper_body | white_background | beer_mug | grey_hair | holding_tray | black_footwear | black_socks | bare_shoulders | shoulder_cutout | black_corset | white_leotard | smile | black_thighhighs | black_ribbon | brown_sweater | sunglasses | eyewear_on_head | twintails | garter_straps | black_skirt | hair_ribbon | necklace | bangs | long_sleeves | black_bra | off-shoulder_sweater | sitting | collarbone | hair_between_eyes | handbag | black_bikini | outdoors | thighs | hair_bow | beach | jewelry | navel | wet | side-tie_bikini_bottom | water | day | ocean | sand | sky | cloud |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:---------------|:-----------|:----------------|:----------------|:--------------------|:-----------------------------|:-------------|:-------|:-------------|:-------------------|:--------|:----------------------|:--------------------|:----------------|:-------------|:-------------------|:-----------|:------------|:---------------|:-----------------|:--------------|:-----------------|:------------------|:---------------|:----------------|:--------|:-------------------|:---------------|:----------------|:-------------|:------------------|:------------|:----------------|:--------------|:--------------|:-----------|:--------|:---------------|:------------|:-----------------------|:----------|:-------------|:--------------------|:----------|:---------------|:-----------|:---------|:-----------|:--------|:----------|:--------|:------|:-------------------------|:--------|:------|:--------|:-------|:------|:--------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 39 |  |  |  |  |  | X | X | X | | X | X | X | | X | X | X | | X | | X | | | X | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 33 |  |  |  |  |  | X | | | X | | | X | X | X | X | | | X | | | X | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 4 | 12 |  |  |  |  |  | X | | | X | | | X | | X | X | | | X | | | | | | | X | | | | X | | | | X | X | X | | | | X | | | X | | X | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_golaxy__gogpt-7b-bloom | ---
pretty_name: Evaluation run of golaxy/gogpt-7b-bloom
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [golaxy/gogpt-7b-bloom](https://huggingface.co/golaxy/gogpt-7b-bloom) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_golaxy__gogpt-7b-bloom\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-14T21:01:38.341280](https://huggingface.co/datasets/open-llm-leaderboard/details_golaxy__gogpt-7b-bloom/blob/main/results_2023-10-14T21-01-38.341280.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2214765100671141,\n\
\ \"em_stderr\": 0.004252451287967787,\n \"f1\": 0.25772336409395996,\n\
\ \"f1_stderr\": 0.00428261897007673,\n \"acc\": 0.31452249408050514,\n\
\ \"acc_stderr\": 0.006788199951115784\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.2214765100671141,\n \"em_stderr\": 0.004252451287967787,\n\
\ \"f1\": 0.25772336409395996,\n \"f1_stderr\": 0.00428261897007673\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6290449881610103,\n\
\ \"acc_stderr\": 0.013576399902231568\n }\n}\n```"
repo_url: https://huggingface.co/golaxy/gogpt-7b-bloom
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|arc:challenge|25_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T07_35_20.075381
path:
- '**/details_harness|drop|3_2023-09-17T07-35-20.075381.parquet'
- split: 2023_10_14T21_01_38.341280
path:
- '**/details_harness|drop|3_2023-10-14T21-01-38.341280.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-14T21-01-38.341280.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T07_35_20.075381
path:
- '**/details_harness|gsm8k|5_2023-09-17T07-35-20.075381.parquet'
- split: 2023_10_14T21_01_38.341280
path:
- '**/details_harness|gsm8k|5_2023-10-14T21-01-38.341280.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-14T21-01-38.341280.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hellaswag|10_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T10:56:27.356745.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-31T10:56:27.356745.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-31T10:56:27.356745.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T07_35_20.075381
path:
- '**/details_harness|winogrande|5_2023-09-17T07-35-20.075381.parquet'
- split: 2023_10_14T21_01_38.341280
path:
- '**/details_harness|winogrande|5_2023-10-14T21-01-38.341280.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-14T21-01-38.341280.parquet'
- config_name: results
data_files:
- split: 2023_07_31T10_56_27.356745
path:
- results_2023-07-31T10:56:27.356745.parquet
- split: 2023_09_17T07_35_20.075381
path:
- results_2023-09-17T07-35-20.075381.parquet
- split: 2023_10_14T21_01_38.341280
path:
- results_2023-10-14T21-01-38.341280.parquet
- split: latest
path:
- results_2023-10-14T21-01-38.341280.parquet
---
# Dataset Card for Evaluation run of golaxy/gogpt-7b-bloom
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/golaxy/gogpt-7b-bloom
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [golaxy/gogpt-7b-bloom](https://huggingface.co/golaxy/gogpt-7b-bloom) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_golaxy__gogpt-7b-bloom",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-14T21:01:38.341280](https://huggingface.co/datasets/open-llm-leaderboard/details_golaxy__gogpt-7b-bloom/blob/main/results_2023-10-14T21-01-38.341280.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.2214765100671141,
"em_stderr": 0.004252451287967787,
"f1": 0.25772336409395996,
"f1_stderr": 0.00428261897007673,
"acc": 0.31452249408050514,
"acc_stderr": 0.006788199951115784
},
"harness|drop|3": {
"em": 0.2214765100671141,
"em_stderr": 0.004252451287967787,
"f1": 0.25772336409395996,
"f1_stderr": 0.00428261897007673
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.6290449881610103,
"acc_stderr": 0.013576399902231568
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
SURF-FluidSimulation/FluidSimulation | ---
license: cc-by-nc-4.0
---
# SURF: A Generalisation Benchmark for GNNs Predicting Fluid Dynamics
SURF is a benchmark designed to test the generalization of learned graph-based fluid simulators. The benchmark consists of seven independent datasets:
- Base
- Turned
- Topo
- Range
- Dynamic
- Full
- FullFiner
Each dataset is available as a separate *.zip file and consists of at least 1200 2D incompressible fluid flow simulations with 300 timesteps each.
The data structure is as follows:
- folder: dataset_name
- folders: dpx
- files: sim.npz, triangles.py, constrained_kmeans_20.npy, Simulation_dp1_Timestep_50.png
- folder: Splits
- files: train.txt, test.txt, valid.txt
The file sim.npz (numpy archive) contains the result of the simulation for each timestep at each node:
- 'pointcloud': x, y coordinates
- 'VX': velocity in x-direction
- 'VY': velocity in y-direction
- 'PS': static pressure
- 'PG': dynamic pressure
- 'T': temperature
- 'TC': thermal conductivity of fluid
- 'HC': heat capacity of fluid
The results have the following shape: VX.shape=(#timesteps, #nodes, 1).
The file triangles.py contains the mesh connectivity, with triangles.shape=(#timesteps, #elements, 3). Each triangle is defined by the node numbers in counter-clockwise order.
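A minimal sketch of reading one simulation's arrays with NumPy. The file below is a tiny stand-in built on the spot with the documented keys and shapes; real `sim.npz` files come from the dataset's `dpx` folders and contain all keys listed above.

```python
import numpy as np

# Build a tiny stand-in for sim.npz with a subset of the documented keys
# (real files come from the dataset's dpx folders and hold all fields).
n_timesteps, n_nodes = 4, 10
np.savez(
    "sim.npz",
    pointcloud=np.random.rand(n_timesteps, n_nodes, 2),  # x, y coordinates
    VX=np.random.rand(n_timesteps, n_nodes, 1),          # velocity in x-direction
    VY=np.random.rand(n_timesteps, n_nodes, 1),          # velocity in y-direction
    PS=np.random.rand(n_timesteps, n_nodes, 1),          # static pressure
)

# Loading works the same way for the real files.
sim = np.load("sim.npz")
vx = sim["VX"]
print(vx.shape)  # (timesteps, nodes, 1), matching VX.shape=(#timesteps, #nodes, 1)
```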
|
autoevaluate/autoeval-staging-eval-project-61fd61be-9af8-4428-ac3c-2fe701ee60d1-2119 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- glue
eval_info:
task: binary_classification
model: autoevaluate/binary-classification
metrics: ['matthews_correlation']
dataset_name: glue
dataset_config: sst2
dataset_split: validation
col_mapping:
text: sentence
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Binary Text Classification
* Model: autoevaluate/binary-classification
* Dataset: glue
* Config: sst2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
zolak/twitter_dataset_78_1713152643 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 346079
num_examples: 834
download_size: 177314
dataset_size: 346079
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gorkemsevinc/Training_Dataset_for_chatbots_Virtual_Assistants | ---
dataset_info:
features:
- name: utterance
dtype: string
- name: category
dtype: string
- name: intent
dtype: string
splits:
- name: train
num_bytes: 748747
num_examples: 9712
download_size: 137346
dataset_size: 748747
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_qqp_generalized_third_person_s | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 2528804
num_examples: 15387
- name: test
num_bytes: 24868182
num_examples: 150753
- name: train
num_bytes: 22908072
num_examples: 139262
download_size: 31084547
dataset_size: 50305058
---
# Dataset Card for "MULTI_VALUE_qqp_generalized_third_person_s"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dinhanhx/google-wit-vi | ---
license: cc
task_categories:
- image-to-text
task_ids:
- image-captioning
language:
- vi
size_categories:
- 100M<n<1B
pretty_name: Google WIT Vietnamese
---
# Google WIT Vietnamese
This data repo contains data extracted from [Google WIT](https://github.com/google-research-datasets/wit/blob/main/DATA.md). All extracted data is for the Vietnamese language.
Given a data point `x` in the OG dataset, whose keys follow the OG `field_name`, the filter criterion is
```python
criteria = lambda x: x.get("language", "") == "vi" and x.get("caption_reference_description", "")
```
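For illustration, here is how the criterion behaves on a few hand-made rows (the rows are hypothetical, not real WIT records): only entries with `language == "vi"` and a non-empty `caption_reference_description` pass.

```python
criteria = lambda x: x.get("language", "") == "vi" and x.get("caption_reference_description", "")

# Hypothetical rows mimicking the OG field names.
rows = [
    {"language": "vi", "caption_reference_description": "Một bức ảnh"},  # kept
    {"language": "en", "caption_reference_description": "A photo"},      # wrong language
    {"language": "vi", "caption_reference_description": ""},             # empty caption
]
kept = [r for r in rows if criteria(r)]
print(len(kept))  # 1
```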
## Text-related details
All `.tsv.gz` files follow OG data files in terms of file names and file structures.
### Train split
`wit_v1.train.*.tsv.gz`
Train data length of each file (not including the header),
```
17690
17756
17810
17724
17619
17494
17624
17696
17777
17562
```
Total 176752
### Validation split
`wit_v1.val.*.tsv.gz`
Val data length of each file (not including the header),
```
292
273
275
320
306
```
Total 1466
### Test split
`wit_v1.test.*.tsv.gz`
Test data length of each file (not including the header),
```
215
202
201
201
229
```
Total 1048
## Image-related details
### Image URL only
`*.image_url_list.txt` are simply lists of image URLs from the `*.tsv.gz` files.
Number of image URLs in each file (train, val, test, all):
```
157281
1271
900
159452
```
Google Research has made sure that the splits do not share any of the same images.
### Downloaded Images
⚠ Please, for the love of the gods, read this section carefully.
For `all.index.fmt_id.image_url_list.tsv`, from left to right, without headers, the columns are `index`, `fmt_id`, `image_url`. It maps each `image_url` (in `all.image_url_list.txt`) to a `fmt_id`, and is used for downloading images.
`fmt_id` is:
- used to name images (with proper image extensions) in `images/`.
- `index` zero-padded to 6 digits
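The zero-padding can be reproduced as follows (a sketch; the helper name `to_fmt_id` is our own, not part of the repo):

```python
# Map an index to its fmt_id: the index zero-padded to 6 digits,
# which names the corresponding downloaded image file in images/.
def to_fmt_id(index: int) -> str:
    return str(index).zfill(6)

print(to_fmt_id(42))      # 000042
print(to_fmt_id(157281))  # 157281
```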
Downloading took less than 36 hours with:
- 90 Mbps
- Processor Intel(R) Core(TM) i7-8550U CPU @ 1.80GHz 1.99 GHz
- No asynchronous downloading
For `fail.index.fmt_id.status.image_url_list.tsv`, from left to right, without headers, the columns are `index`, `fmt_id`, `status`, `image_url`. It tracks image URLs that were inaccessible during downloading.
3367 image URLs returned 404 (`status` values). In other words, we were able to download 97.88839275% of the images.
`images/` folder takes disk space of:
- 215 GBs (uncompressed)
- 209 GBs (compressed)
We use Pillow to open each image to make sure that the downloaded images are usable. We also log all faulty files in `corrupted_image_list.json`. There are fewer than 70 such files.
In `corrupted_image_list.json`, each item in the list has the keys `file_name` and `error`. `file_name` is `fmt_id` with an extension but without the `images/` prefix. The errors are either:
- files exceed Pillow default limit
- files are truncated
To actually load those files, the following code can be used to change Pillow behavior
```python
from PIL import Image, ImageFile
# For very big image files
Image.MAX_IMAGE_PIXELS = None
# For truncated image files
ImageFile.LOAD_TRUNCATED_IMAGES = True
```
Zip `images/` folder,
```bash
zip -r images.zip images/
zip images.zip --out spanned_images.zip -s 40g
```
https://superuser.com/questions/336219/how-do-i-split-a-zip-file-into-multiple-segments
Unzip `spanned_images.*` files,
```bash
zip -s 0 spanned_images.zip --out images.zip
unzip images.zip
```
https://unix.stackexchange.com/questions/40480/how-to-unzip-a-multipart-spanned-zip-on-linux |
harpreetsahota/quantization_experiment_results | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: unquantized_generated_text
dtype: string
- name: unquantized_execution_time (s)
dtype: float64
- name: bnb_quantized_generated_text
dtype: string
- name: bnb_quantized_execution_time (s)
dtype: float64
- name: gptq_4bit_generated_text
dtype: string
- name: gptq_4bit_execution_time (s)
dtype: float64
- name: gptq_2bit_generated_text
dtype: string
- name: gptq_2bit_execution_time (s)
dtype: float64
- name: gguf_quantized_generated_text
dtype: string
- name: gguf_quantized_execution_time (s)
dtype: float64
splits:
- name: train
num_bytes: 179988
num_examples: 50
download_size: 107125
dataset_size: 179988
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
distilled-from-one-sec-cv12/chunk_96 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1257986632
num_examples: 245126
download_size: 1284594683
dataset_size: 1257986632
---
# Dataset Card for "chunk_96"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
WillHeld/mtop | ---
dataset_info:
features:
- name: id
dtype: int64
- name: ' intent'
dtype: string
- name: ' slot'
dtype: string
- name: ' utterance'
dtype: string
- name: ' domain'
dtype: string
- name: ' locale'
dtype: string
- name: ' dcp_form'
dtype: string
- name: ' tokens'
dtype: string
- name: intent
dtype: string
- name: slot
dtype: string
- name: utterance
dtype: string
- name: domain
dtype: string
- name: locale
dtype: string
- name: dcp_form
dtype: string
- name: tokens
dtype: string
splits:
- name: eval_en
num_bytes: 2077234
num_examples: 2235
- name: test_en
num_bytes: 4090856
num_examples: 4386
- name: train_en
num_bytes: 14501480
num_examples: 15667
- name: eval_de
num_bytes: 1764320
num_examples: 1815
- name: test_de
num_bytes: 3439946
num_examples: 3549
- name: train_de
num_bytes: 13122042
num_examples: 13424
- name: eval_es
num_bytes: 1594238
num_examples: 1527
- name: test_es
num_bytes: 3089782
num_examples: 2998
- name: train_es
num_bytes: 11277514
num_examples: 10934
- name: eval_fr
num_bytes: 1607082
num_examples: 1577
- name: test_fr
num_bytes: 3289276
num_examples: 3193
- name: train_fr
num_bytes: 12147836
num_examples: 11814
- name: eval_hi
num_bytes: 2618172
num_examples: 2012
- name: test_hi
num_bytes: 3491690
num_examples: 2789
- name: train_hi
num_bytes: 14225324
num_examples: 11330
- name: eval_th
num_bytes: 2251378
num_examples: 1671
- name: test_th
num_bytes: 3654864
num_examples: 2765
- name: train_th
num_bytes: 14277512
num_examples: 10759
download_size: 16165451
dataset_size: 112520546
---
# Dataset Card for "mtop"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bdsaglam/webnlg-jerx-sft-mt-ms-openai | ---
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 11817219
num_examples: 11796
- name: dev
num_bytes: 1496804
num_examples: 1500
- name: test
num_bytes: 2681853
num_examples: 2415
download_size: 5332198
dataset_size: 15995876
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
|
AdapterOcean/oasst_top1_standardized_cluster_1_std | ---
dataset_info:
features:
- name: message_type
dtype: string
- name: message
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 7551627
num_examples: 12334
download_size: 4161724
dataset_size: 7551627
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "oasst_top1_standardized_cluster_1_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Baidicoot/comic_villain_completions | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: adv_completion
dtype: string
- name: clean_completion
dtype: string
splits:
- name: ft_data
num_bytes: 1412071
num_examples: 2500
- name: laa_data
num_bytes: 1461228
num_examples: 2500
download_size: 1833979
dataset_size: 2873299
configs:
- config_name: default
data_files:
- split: ft_data
path: data/ft_data-*
- split: laa_data
path: data/laa_data-*
---
|
open-llm-leaderboard/details_ogimgio__gpt-neo-125m-neurallinguisticpioneers | ---
pretty_name: Evaluation run of ogimgio/gpt-neo-125m-neurallinguisticpioneers
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ogimgio/gpt-neo-125m-neurallinguisticpioneers](https://huggingface.co/ogimgio/gpt-neo-125m-neurallinguisticpioneers)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ogimgio__gpt-neo-125m-neurallinguisticpioneers\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T03:45:38.309218](https://huggingface.co/datasets/open-llm-leaderboard/details_ogimgio__gpt-neo-125m-neurallinguisticpioneers/blob/main/results_2023-10-25T03-45-38.309218.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0024119127516778523,\n\
\ \"em_stderr\": 0.0005023380498893281,\n \"f1\": 0.041171350671141034,\n\
\ \"f1_stderr\": 0.0012117312202057759,\n \"acc\": 0.2564958864222914,\n\
\ \"acc_stderr\": 0.007403214467064075\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0024119127516778523,\n \"em_stderr\": 0.0005023380498893281,\n\
\ \"f1\": 0.041171350671141034,\n \"f1_stderr\": 0.0012117312202057759\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \
\ \"acc_stderr\": 0.0007581501137225334\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5122336227308603,\n \"acc_stderr\": 0.014048278820405616\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ogimgio/gpt-neo-125m-neurallinguisticpioneers
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T03_45_38.309218
path:
- '**/details_harness|drop|3_2023-10-25T03-45-38.309218.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T03-45-38.309218.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T03_45_38.309218
path:
- '**/details_harness|gsm8k|5_2023-10-25T03-45-38.309218.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T03-45-38.309218.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:56:53.861726.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:56:53.861726.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:56:53.861726.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T03_45_38.309218
path:
- '**/details_harness|winogrande|5_2023-10-25T03-45-38.309218.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T03-45-38.309218.parquet'
- config_name: results
data_files:
- split: 2023_07_19T13_56_53.861726
path:
- results_2023-07-19T13:56:53.861726.parquet
- split: 2023_10_25T03_45_38.309218
path:
- results_2023-10-25T03-45-38.309218.parquet
- split: latest
path:
- results_2023-10-25T03-45-38.309218.parquet
---
# Dataset Card for Evaluation run of ogimgio/gpt-neo-125m-neurallinguisticpioneers
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ogimgio/gpt-neo-125m-neurallinguisticpioneers
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ogimgio/gpt-neo-125m-neurallinguisticpioneers](https://huggingface.co/ogimgio/gpt-neo-125m-neurallinguisticpioneers) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ogimgio__gpt-neo-125m-neurallinguisticpioneers",
"harness_winogrande_5",
split="train")
```
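The timestamp-named splits follow a fixed pattern (`%Y_%m_%dT%H_%M_%S.%f`), so they can be ordered chronologically. The helper below is only an illustration of that convention, not part of the leaderboard tooling:

```python
from datetime import datetime

def parse_split_timestamp(split_name: str) -> datetime:
    """Convert a timestamp-named split such as '2023_10_25T03_45_38.309218'
    back into a datetime object."""
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

# Example: find the most recent run among timestamp-named splits.
runs = ["2023_07_19T13_56_53.861726", "2023_10_25T03_45_38.309218"]
latest_run = max(runs, key=parse_split_timestamp)
print(latest_run)  # 2023_10_25T03_45_38.309218
```

The `latest` split added by the evaluation pipeline always mirrors the most recent of these timestamped splits.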
## Latest results
These are the [latest results from run 2023-10-25T03:45:38.309218](https://huggingface.co/datasets/open-llm-leaderboard/details_ogimgio__gpt-neo-125m-neurallinguisticpioneers/blob/main/results_2023-10-25T03-45-38.309218.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the "results" config and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0024119127516778523,
"em_stderr": 0.0005023380498893281,
"f1": 0.041171350671141034,
"f1_stderr": 0.0012117312202057759,
"acc": 0.2564958864222914,
"acc_stderr": 0.007403214467064075
},
"harness|drop|3": {
"em": 0.0024119127516778523,
"em_stderr": 0.0005023380498893281,
"f1": 0.041171350671141034,
"f1_stderr": 0.0012117312202057759
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225334
},
"harness|winogrande|5": {
"acc": 0.5122336227308603,
"acc_stderr": 0.014048278820405616
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
BrachioLab/supernova-timeseries | ---
license: mit
dataset_info:
features:
- name: objid
dtype: int32
- name: times_wv
dtype:
array2_d:
shape:
- 300
- 2
dtype: float64
- name: target
dtype:
array2_d:
shape:
- 300
- 2
dtype: float64
- name: label
dtype:
class_label:
names:
'0': $\mu$-Lens-Single
'1': TDE
'2': EB
'3': SNII
'4': SNIax
'5': Mira
'6': SNIbc
'7': KN
'8': M-dwarf
'9': SNIa-91bg
'10': AGN
'11': SNIa
'12': RRL
'13': SLSN-I
'14': extra
- name: redshift
dtype: float32
splits:
- name: train
num_bytes: 75438576
num_examples: 6274
- name: validation
num_bytes: 9402768
num_examples: 782
- name: test
num_bytes: 9523008
num_examples: 792
download_size: 33374835
dataset_size: 94364352
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
yzhuang/metatree_mv | ---
dataset_info:
features:
- name: id
dtype: int64
- name: X
sequence: float64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 2151256
num_examples: 28306
- name: validation
num_bytes: 947112
num_examples: 12462
download_size: 2823534
dataset_size: 3098368
---
# Dataset Card for "metatree_mv"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlekseyKorshuk/product-photography-v1-tiny-prompts-tasks-collage | ---
dataset_info:
features:
- name: product_id
dtype: string
- name: id
dtype: string
- name: category
dtype: string
- name: product
dtype: string
- name: prompt
dtype: string
- name: task_id
dtype: string
- name: status
dtype: string
- name: midjourney_image
dtype: image
splits:
- name: train
num_bytes: 28250972235.316814
num_examples: 3982
- name: validation
num_bytes: 3774708422.5457873
num_examples: 527
download_size: 32506212607
dataset_size: 32025680657.862602
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/8c4d782c | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1334
dataset_size: 182
---
# Dataset Card for "8c4d782c"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
janldeboer/reddit-relationships | ---
license: cc-by-4.0
---
|
open-llm-leaderboard/details_AIGym__deepseek-coder-6.7b-chat-and-function-calling | ---
pretty_name: Evaluation run of AIGym/deepseek-coder-6.7b-chat-and-function-calling
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AIGym/deepseek-coder-6.7b-chat-and-function-calling](https://huggingface.co/AIGym/deepseek-coder-6.7b-chat-and-function-calling)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AIGym__deepseek-coder-6.7b-chat-and-function-calling\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-04T23:57:59.059131](https://huggingface.co/datasets/open-llm-leaderboard/details_AIGym__deepseek-coder-6.7b-chat-and-function-calling/blob/main/results_2024-02-04T23-57-59.059131.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the \"results\" config and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3821937199083958,\n\
\ \"acc_stderr\": 0.0343357567097525,\n \"acc_norm\": 0.385141155502206,\n\
\ \"acc_norm_stderr\": 0.035093328270488326,\n \"mc1\": 0.2594859241126071,\n\
\ \"mc1_stderr\": 0.015345409485557982,\n \"mc2\": 0.4283133114995368,\n\
\ \"mc2_stderr\": 0.01473438764724853\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.33276450511945393,\n \"acc_stderr\": 0.013769863046192312,\n\
\ \"acc_norm\": 0.3609215017064846,\n \"acc_norm_stderr\": 0.01403476138617546\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.41236805417247563,\n\
\ \"acc_stderr\": 0.0049125470401328785,\n \"acc_norm\": 0.5380402310296754,\n\
\ \"acc_norm_stderr\": 0.004975319435777099\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37777777777777777,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.37777777777777777,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.35526315789473684,\n \"acc_stderr\": 0.038947344870133176,\n\
\ \"acc_norm\": 0.35526315789473684,\n \"acc_norm_stderr\": 0.038947344870133176\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.41,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.41509433962264153,\n \"acc_stderr\": 0.030325945789286105,\n\
\ \"acc_norm\": 0.41509433962264153,\n \"acc_norm_stderr\": 0.030325945789286105\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3263888888888889,\n\
\ \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.3263888888888889,\n\
\ \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3352601156069364,\n\
\ \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.3352601156069364,\n\
\ \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.030683020843231,\n\
\ \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.030683020843231\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.02422996529842508,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.02422996529842508\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.38387096774193546,\n\
\ \"acc_stderr\": 0.02766618207553964,\n \"acc_norm\": 0.38387096774193546,\n\
\ \"acc_norm_stderr\": 0.02766618207553964\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233484,\n\
\ \"acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233484\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03825460278380025,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03825460278380025\n },\n\
\ \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.4292929292929293,\n\
\ \"acc_stderr\": 0.035265527246011986,\n \"acc_norm\": 0.4292929292929293,\n\
\ \"acc_norm_stderr\": 0.035265527246011986\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\"\
: {\n \"acc\": 0.39896373056994816,\n \"acc_stderr\": 0.03533999094065696,\n\
\ \"acc_norm\": 0.39896373056994816,\n \"acc_norm_stderr\": 0.03533999094065696\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.33076923076923076,\n \"acc_stderr\": 0.023854795680971128,\n\
\ \"acc_norm\": 0.33076923076923076,\n \"acc_norm_stderr\": 0.023854795680971128\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3697478991596639,\n \"acc_stderr\": 0.03135709599613591,\n \
\ \"acc_norm\": 0.3697478991596639,\n \"acc_norm_stderr\": 0.03135709599613591\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3706422018348624,\n \"acc_stderr\": 0.020707458164352984,\n \"\
acc_norm\": 0.3706422018348624,\n \"acc_norm_stderr\": 0.020707458164352984\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.0321495214780275,\n \"acc_norm\"\
: 0.3333333333333333,\n \"acc_norm_stderr\": 0.0321495214780275\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.35294117647058826,\n\
\ \"acc_stderr\": 0.03354092437591518,\n \"acc_norm\": 0.35294117647058826,\n\
\ \"acc_norm_stderr\": 0.03354092437591518\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.35864978902953587,\n \"acc_stderr\": 0.031219569445301843,\n\
\ \"acc_norm\": 0.35864978902953587,\n \"acc_norm_stderr\": 0.031219569445301843\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3901345291479821,\n\
\ \"acc_stderr\": 0.03273766725459156,\n \"acc_norm\": 0.3901345291479821,\n\
\ \"acc_norm_stderr\": 0.03273766725459156\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.46564885496183206,\n \"acc_stderr\": 0.043749285605997376,\n\
\ \"acc_norm\": 0.46564885496183206,\n \"acc_norm_stderr\": 0.043749285605997376\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.512396694214876,\n \"acc_stderr\": 0.04562951548180765,\n \"acc_norm\"\
: 0.512396694214876,\n \"acc_norm_stderr\": 0.04562951548180765\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.28703703703703703,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4294478527607362,\n \"acc_stderr\": 0.038890666191127216,\n\
\ \"acc_norm\": 0.4294478527607362,\n \"acc_norm_stderr\": 0.038890666191127216\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3883495145631068,\n \"acc_stderr\": 0.048257293373563895,\n\
\ \"acc_norm\": 0.3883495145631068,\n \"acc_norm_stderr\": 0.048257293373563895\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6495726495726496,\n\
\ \"acc_stderr\": 0.0312561082442188,\n \"acc_norm\": 0.6495726495726496,\n\
\ \"acc_norm_stderr\": 0.0312561082442188\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4061302681992337,\n\
\ \"acc_stderr\": 0.017562037406478916,\n \"acc_norm\": 0.4061302681992337,\n\
\ \"acc_norm_stderr\": 0.017562037406478916\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4190751445086705,\n \"acc_stderr\": 0.026564178111422622,\n\
\ \"acc_norm\": 0.4190751445086705,\n \"acc_norm_stderr\": 0.026564178111422622\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29608938547486036,\n\
\ \"acc_stderr\": 0.015268677317602257,\n \"acc_norm\": 0.29608938547486036,\n\
\ \"acc_norm_stderr\": 0.015268677317602257\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4084967320261438,\n \"acc_stderr\": 0.02814640599309636,\n\
\ \"acc_norm\": 0.4084967320261438,\n \"acc_norm_stderr\": 0.02814640599309636\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4115755627009646,\n\
\ \"acc_stderr\": 0.02795048149440126,\n \"acc_norm\": 0.4115755627009646,\n\
\ \"acc_norm_stderr\": 0.02795048149440126\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.02517104191530968,\n\
\ \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.02517104191530968\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.34397163120567376,\n \"acc_stderr\": 0.028338017428611334,\n \
\ \"acc_norm\": 0.34397163120567376,\n \"acc_norm_stderr\": 0.028338017428611334\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2926988265971317,\n\
\ \"acc_stderr\": 0.01162094919584953,\n \"acc_norm\": 0.2926988265971317,\n\
\ \"acc_norm_stderr\": 0.01162094919584953\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.41911764705882354,\n \"acc_stderr\": 0.029972807170464622,\n\
\ \"acc_norm\": 0.41911764705882354,\n \"acc_norm_stderr\": 0.029972807170464622\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.32516339869281047,\n \"acc_stderr\": 0.018950886770806304,\n \
\ \"acc_norm\": 0.32516339869281047,\n \"acc_norm_stderr\": 0.018950886770806304\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4818181818181818,\n\
\ \"acc_stderr\": 0.04785964010794917,\n \"acc_norm\": 0.4818181818181818,\n\
\ \"acc_norm_stderr\": 0.04785964010794917\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.46530612244897956,\n \"acc_stderr\": 0.03193207024425314,\n\
\ \"acc_norm\": 0.46530612244897956,\n \"acc_norm_stderr\": 0.03193207024425314\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.43781094527363185,\n\
\ \"acc_stderr\": 0.0350808011219984,\n \"acc_norm\": 0.43781094527363185,\n\
\ \"acc_norm_stderr\": 0.0350808011219984\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.38596491228070173,\n \"acc_stderr\": 0.03733756969066163,\n\
\ \"acc_norm\": 0.38596491228070173,\n \"acc_norm_stderr\": 0.03733756969066163\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2594859241126071,\n\
\ \"mc1_stderr\": 0.015345409485557982,\n \"mc2\": 0.4283133114995368,\n\
\ \"mc2_stderr\": 0.01473438764724853\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5722178374112076,\n \"acc_stderr\": 0.013905134013839944\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17210007581501138,\n \
\ \"acc_stderr\": 0.010397328057879003\n }\n}\n```"
repo_url: https://huggingface.co/AIGym/deepseek-coder-6.7b-chat-and-function-calling
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|arc:challenge|25_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|gsm8k|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hellaswag|10_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T23-57-59.059131.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-04T23-57-59.059131.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- '**/details_harness|winogrande|5_2024-02-04T23-57-59.059131.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-04T23-57-59.059131.parquet'
- config_name: results
data_files:
- split: 2024_02_04T23_57_59.059131
path:
- results_2024-02-04T23-57-59.059131.parquet
- split: latest
path:
- results_2024-02-04T23-57-59.059131.parquet
---
# Dataset Card for Evaluation run of AIGym/deepseek-coder-6.7b-chat-and-function-calling
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AIGym/deepseek-coder-6.7b-chat-and-function-calling](https://huggingface.co/AIGym/deepseek-coder-6.7b-chat-and-function-calling) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AIGym__deepseek-coder-6.7b-chat-and-function-calling",
"harness_winogrande_5",
split="train")
```
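Since the config names on this card follow the regular pattern `harness_<task>_<n_shot>`, per-task details can also be loaded programmatically. A minimal sketch (the helper names below are illustrative, not part of the `datasets` API):

```python
def config_name(task: str, n_shot: int) -> str:
    """Build a config name such as 'harness_hendrycksTest_anatomy_5'."""
    return f"harness_{task}_{n_shot}"


def load_task_details(repo: str, task: str, n_shot: int, split: str = "latest"):
    """Load the per-example details for one task from this eval repo.

    The 'latest' split always tracks the most recent run, while
    timestamped splits (e.g. '2024_02_04T23_57_59.059131') pin a run.
    """
    from datasets import load_dataset  # requires the `datasets` package

    return load_dataset(repo, config_name(task, n_shot), split=split)
```

For example, `load_task_details("open-llm-leaderboard/details_AIGym__deepseek-coder-6.7b-chat-and-function-calling", "hendrycksTest_anatomy", 5)` would fetch the 5-shot anatomy details listed in the configs above.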
## Latest results
These are the [latest results from run 2024-02-04T23:57:59.059131](https://huggingface.co/datasets/open-llm-leaderboard/details_AIGym__deepseek-coder-6.7b-chat-and-function-calling/blob/main/results_2024-02-04T23-57-59.059131.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its timestamped and "latest" splits):
```python
{
"all": {
"acc": 0.3821937199083958,
"acc_stderr": 0.0343357567097525,
"acc_norm": 0.385141155502206,
"acc_norm_stderr": 0.035093328270488326,
"mc1": 0.2594859241126071,
"mc1_stderr": 0.015345409485557982,
"mc2": 0.4283133114995368,
"mc2_stderr": 0.01473438764724853
},
"harness|arc:challenge|25": {
"acc": 0.33276450511945393,
"acc_stderr": 0.013769863046192312,
"acc_norm": 0.3609215017064846,
"acc_norm_stderr": 0.01403476138617546
},
"harness|hellaswag|10": {
"acc": 0.41236805417247563,
"acc_stderr": 0.0049125470401328785,
"acc_norm": 0.5380402310296754,
"acc_norm_stderr": 0.004975319435777099
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.35526315789473684,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.35526315789473684,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.41509433962264153,
"acc_stderr": 0.030325945789286105,
"acc_norm": 0.41509433962264153,
"acc_norm_stderr": 0.030325945789286105
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3263888888888889,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.3263888888888889,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3352601156069364,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.3352601156069364,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.030683020843231,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.030683020843231
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.02422996529842508,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.02422996529842508
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.38387096774193546,
"acc_stderr": 0.02766618207553964,
"acc_norm": 0.38387096774193546,
"acc_norm_stderr": 0.02766618207553964
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233484,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233484
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4,
"acc_stderr": 0.03825460278380025,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03825460278380025
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4292929292929293,
"acc_stderr": 0.035265527246011986,
"acc_norm": 0.4292929292929293,
"acc_norm_stderr": 0.035265527246011986
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.39896373056994816,
"acc_stderr": 0.03533999094065696,
"acc_norm": 0.39896373056994816,
"acc_norm_stderr": 0.03533999094065696
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.33076923076923076,
"acc_stderr": 0.023854795680971128,
"acc_norm": 0.33076923076923076,
"acc_norm_stderr": 0.023854795680971128
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3697478991596639,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.3697478991596639,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3706422018348624,
"acc_stderr": 0.020707458164352984,
"acc_norm": 0.3706422018348624,
"acc_norm_stderr": 0.020707458164352984
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.0321495214780275,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.0321495214780275
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.03354092437591518,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.03354092437591518
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.35864978902953587,
"acc_stderr": 0.031219569445301843,
"acc_norm": 0.35864978902953587,
"acc_norm_stderr": 0.031219569445301843
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3901345291479821,
"acc_stderr": 0.03273766725459156,
"acc_norm": 0.3901345291479821,
"acc_norm_stderr": 0.03273766725459156
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.46564885496183206,
"acc_stderr": 0.043749285605997376,
"acc_norm": 0.46564885496183206,
"acc_norm_stderr": 0.043749285605997376
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.512396694214876,
"acc_stderr": 0.04562951548180765,
"acc_norm": 0.512396694214876,
"acc_norm_stderr": 0.04562951548180765
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4294478527607362,
"acc_stderr": 0.038890666191127216,
"acc_norm": 0.4294478527607362,
"acc_norm_stderr": 0.038890666191127216
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.3883495145631068,
"acc_stderr": 0.048257293373563895,
"acc_norm": 0.3883495145631068,
"acc_norm_stderr": 0.048257293373563895
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6495726495726496,
"acc_stderr": 0.0312561082442188,
"acc_norm": 0.6495726495726496,
"acc_norm_stderr": 0.0312561082442188
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4061302681992337,
"acc_stderr": 0.017562037406478916,
"acc_norm": 0.4061302681992337,
"acc_norm_stderr": 0.017562037406478916
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4190751445086705,
"acc_stderr": 0.026564178111422622,
"acc_norm": 0.4190751445086705,
"acc_norm_stderr": 0.026564178111422622
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29608938547486036,
"acc_stderr": 0.015268677317602257,
"acc_norm": 0.29608938547486036,
"acc_norm_stderr": 0.015268677317602257
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4084967320261438,
"acc_stderr": 0.02814640599309636,
"acc_norm": 0.4084967320261438,
"acc_norm_stderr": 0.02814640599309636
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4115755627009646,
"acc_stderr": 0.02795048149440126,
"acc_norm": 0.4115755627009646,
"acc_norm_stderr": 0.02795048149440126
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.02517104191530968,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.02517104191530968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.34397163120567376,
"acc_stderr": 0.028338017428611334,
"acc_norm": 0.34397163120567376,
"acc_norm_stderr": 0.028338017428611334
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2926988265971317,
"acc_stderr": 0.01162094919584953,
"acc_norm": 0.2926988265971317,
"acc_norm_stderr": 0.01162094919584953
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41911764705882354,
"acc_stderr": 0.029972807170464622,
"acc_norm": 0.41911764705882354,
"acc_norm_stderr": 0.029972807170464622
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.32516339869281047,
"acc_stderr": 0.018950886770806304,
"acc_norm": 0.32516339869281047,
"acc_norm_stderr": 0.018950886770806304
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4818181818181818,
"acc_stderr": 0.04785964010794917,
"acc_norm": 0.4818181818181818,
"acc_norm_stderr": 0.04785964010794917
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.46530612244897956,
"acc_stderr": 0.03193207024425314,
"acc_norm": 0.46530612244897956,
"acc_norm_stderr": 0.03193207024425314
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.43781094527363185,
"acc_stderr": 0.0350808011219984,
"acc_norm": 0.43781094527363185,
"acc_norm_stderr": 0.0350808011219984
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.03733756969066163,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.03733756969066163
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2594859241126071,
"mc1_stderr": 0.015345409485557982,
"mc2": 0.4283133114995368,
"mc2_stderr": 0.01473438764724853
},
"harness|winogrande|5": {
"acc": 0.5722178374112076,
"acc_stderr": 0.013905134013839944
},
"harness|gsm8k|5": {
"acc": 0.17210007581501138,
"acc_stderr": 0.010397328057879003
}
}
```
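The "all" block above aggregates the per-task metrics (for the MMLU-style `hendrycksTest` subjects, presumably as an unweighted macro-average). A small sketch of that relationship, using three of the subject accuracies reported above:

```python
# Per-subject accuracies copied from the results above (subset only).
per_task_acc = {
    "harness|hendrycksTest-abstract_algebra|5": 0.3,
    "harness|hendrycksTest-anatomy|5": 0.37777777777777777,
    "harness|hendrycksTest-astronomy|5": 0.35526315789473684,
}

# Macro-average: unweighted mean over tasks, ignoring subject sizes.
macro_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(round(macro_acc, 4))  # 0.3443 for this three-subject subset
```

Extending the same mean over every task that reports `acc` would approximate the overall figure shown in the "all" block, up to the leaderboard's exact aggregation rules.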
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
adityasihag/math_QAaugP | ---
license: mit
---
The math_QAaugP dataset is a combination of [MetaMathQA](https://huggingface.co/datasets/meta-math/MetaMathQA), [MathInstruct](https://huggingface.co/datasets/TIGER-Lab/MathInstruct), and some internal data.
For the combined MetaMathQA and MathInstruct portion, we use the [Arithmo](https://huggingface.co/datasets/akjindal53244/Arithmo-Data) dataset.
|
liuyanchen1015/VALUE_rte_null_relcl | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 27785
num_examples: 62
- name: test
num_bytes: 269307
num_examples: 602
- name: train
num_bytes: 229297
num_examples: 501
download_size: 355526
dataset_size: 526389
---
# Dataset Card for "VALUE_rte_null_relcl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
multimodalart/panda-70m | ---
task_categories:
- image-to-text
- text-to-image
- text-to-video
- image-to-video
language:
- en
pretty_name: Panda-70M
splits:
- name: train
- name: train_2m
- name: train_10m
- name: test
- name: validation
---
# [Panda 70M dataset by Snap Inc](https://snap-research.github.io/Panda-70M/)
### 70M video-caption pairs
Code for downloading: https://github.com/snap-research/Panda-70M/dataset_dataloading |
distilled-from-one-sec-cv12/chunk_105 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 978364480
num_examples: 190640
download_size: 1000035243
dataset_size: 978364480
---
# Dataset Card for "chunk_105"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
datahrvoje/twitter_dataset_1713078743 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 24626
num_examples: 54
download_size: 13380
dataset_size: 24626
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
suolyer/pile_philpapers | ---
license: apache-2.0
---
|
sanderland/sobotta-anatomical-dataset | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 54613498.0
num_examples: 14
download_size: 33366858
dataset_size: 54613498.0
---
# Dataset Card for "sobotta-anatomical-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PL-MTEB/sicke-pl-pairclassification | ---
license: cc-by-nc-sa-3.0
---
|
shawt100/shawt100 | ---
license: openrail
---
```python
from datasets import load_dataset

# If the dataset is gated/private, make sure you have run huggingface-cli login
dataset = load_dataset("shawt100/shawt100")
```
|
dvilasuero/sharegpt-curation | ---
size_categories: 10K<n<100K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for sharegpt-curation
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("dvilasuero/sharegpt-curation")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("dvilasuero/sharegpt-curation")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, **vectors**, and **guidelines**.
The **fields** are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| prompt | Prompt | FieldTypes.text | True | True |
| response | Response | FieldTypes.text | True | True |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| prompt-quality | Prompt Quality | QuestionTypes.rating | True | N/A | [1, 2, 3, 4, 5, 6, 7] |
| prompt-intent | Prompt Intent | QuestionTypes.label_selection | True | N/A | ['generation', 'rewrite', 'extract', 'closed-qa', 'open-qa', 'classification', 'summarization', 'brainstorming', 'chat', 'code', 'other'] |
| response-toxicity | Response Toxicity | QuestionTypes.multi_label_selection | False | N/A | ['illegal', 'harmfull', 'unqualified advice'] |
The **suggestions** are human- or machine-generated recommendations for each question, intended to assist the annotator during the annotation process. They are always linked to an existing question and are named by appending "-suggestion" and "-suggestion-metadata" to the question name, holding the suggested value(s) and the suggestion's metadata, respectively. The possible values are therefore the same as in the table above, with the column names suffixed accordingly.
The **metadata** is a dictionary that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
| Metadata Name | Title | Type | Values | Visible for Annotators |
| ------------- | ----- | ---- | ------ | ---------------------- |
| prompt_token_length_mean | Prompt Token Length Mean | MetadataPropertyTypes.float | None - None | True |
| prompt_token_length_median | Prompt Token Length Median | MetadataPropertyTypes.float | None - None | True |
| prompt_token_length_std | Prompt Token Length Std | MetadataPropertyTypes.float | None - None | True |
| prompt_sentence_length_mean | Prompt Sentence Length Mean | MetadataPropertyTypes.float | None - None | True |
| prompt_sentence_length_median | Prompt Sentence Length Median | MetadataPropertyTypes.float | None - None | True |
| prompt_sentence_length_std | Prompt Sentence Length Std | MetadataPropertyTypes.float | None - None | True |
| prompt_syllables_per_token_mean | Prompt Syllables Per Token Mean | MetadataPropertyTypes.float | None - None | True |
| prompt_syllables_per_token_median | Prompt Syllables Per Token Median | MetadataPropertyTypes.float | None - None | True |
| prompt_syllables_per_token_std | Prompt Syllables Per Token Std | MetadataPropertyTypes.float | None - None | True |
| prompt_n_tokens | Prompt N Tokens | MetadataPropertyTypes.integer | None - None | True |
| prompt_n_unique_tokens | Prompt N Unique Tokens | MetadataPropertyTypes.integer | None - None | True |
| prompt_proportion_unique_tokens | Prompt Proportion Unique Tokens | MetadataPropertyTypes.float | None - None | True |
| prompt_n_characters | Prompt N Characters | MetadataPropertyTypes.integer | None - None | True |
| prompt_n_sentences | Prompt N Sentences | MetadataPropertyTypes.integer | None - None | True |
| response_token_length_mean | Response Token Length Mean | MetadataPropertyTypes.float | None - None | True |
| response_token_length_median | Response Token Length Median | MetadataPropertyTypes.float | None - None | True |
| response_token_length_std | Response Token Length Std | MetadataPropertyTypes.float | None - None | True |
| response_sentence_length_mean | Response Sentence Length Mean | MetadataPropertyTypes.float | None - None | True |
| response_sentence_length_median | Response Sentence Length Median | MetadataPropertyTypes.float | None - None | True |
| response_sentence_length_std | Response Sentence Length Std | MetadataPropertyTypes.float | None - None | True |
| response_syllables_per_token_mean | Response Syllables Per Token Mean | MetadataPropertyTypes.float | None - None | True |
| response_syllables_per_token_median | Response Syllables Per Token Median | MetadataPropertyTypes.float | None - None | True |
| response_syllables_per_token_std | Response Syllables Per Token Std | MetadataPropertyTypes.float | None - None | True |
| response_n_tokens | Response N Tokens | MetadataPropertyTypes.integer | None - None | True |
| response_n_unique_tokens | Response N Unique Tokens | MetadataPropertyTypes.integer | None - None | True |
| response_proportion_unique_tokens | Response Proportion Unique Tokens | MetadataPropertyTypes.float | None - None | True |
| response_n_characters | Response N Characters | MetadataPropertyTypes.integer | None - None | True |
| response_n_sentences | Response N Sentences | MetadataPropertyTypes.integer | None - None | True |
The **guidelines**, are optional as well, and are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
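The token- and length-based metadata properties listed above can be approximated with a plain whitespace tokenizer. A minimal sketch follows; note this is an assumption about how such values could be computed, not the pipeline actually used for this dataset (which presumably relied on a dedicated text-statistics library, so exact numbers may differ):

```python
import statistics

def prompt_metrics(text: str) -> dict:
    """Approximate a few of the card's metadata properties using a
    simple whitespace tokenizer (illustrative only)."""
    tokens = text.split()
    lengths = [len(t) for t in tokens]
    unique = {t.lower() for t in tokens}
    return {
        "n_tokens": len(tokens),
        "n_unique_tokens": len(unique),
        "proportion_unique_tokens": len(unique) / len(tokens),
        "token_length_mean": statistics.mean(lengths),
        "token_length_median": statistics.median(lengths),
    }

metrics = prompt_metrics("Write an ebook with title Intelligent Kitchen")
print(metrics["n_tokens"])  # 7
```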
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": null,
"fields": {
"prompt": "You\u0027re an ebook writer. Write an ebook with title \"Intelligent Kitchen\" and provide sub headline for 50 winning products",
"response": "\u003cdiv class=\"markdown prose w-full break-words dark:prose-invert light\"\u003e\u003cp\u003eTitle: Intelligent Kitchen: How to Transform Your Cooking Experience with Smart Technology\u003c/p\u003e\u003cp\u003eSub Headline: 50 Winning Products to Upgrade Your Kitchen into an Intelligent One\u003c/p\u003e\u003cp\u003eIntroduction:\nThe kitchen is the heart of any home, and it\u0027s where we spend a lot of our time cooking and preparing meals for our loved ones. With the advancement of technology, our kitchens are becoming more intelligent, making cooking and entertaining easier, faster, and more enjoyable. This ebook is a guide for anyone who wants to transform their kitchen into an intelligent one by incorporating smart technology. We\u0027ve curated a list of 50 winning products that will make your cooking experience more efficient, convenient, and fun.\u003c/p\u003e\u003cp\u003eChapter 1: Smart Appliances\u003c/p\u003e\u003col\u003e\u003cli\u003eSmart Ovens\u003c/li\u003e\u003cli\u003eSmart Fridges\u003c/li\u003e\u003cli\u003eSmart Dishwashers\u003c/li\u003e\u003cli\u003eSmart Coffee Makers\u003c/li\u003e\u003cli\u003eSmart Slow Cookers\u003c/li\u003e\u003c/ol\u003e\u003cp\u003eChapter 2: Voice Assistants\n6. Amazon Echo\u003c/p\u003e\u003col start=\"7\"\u003e\u003cli\u003eGoogle Home\u003c/li\u003e\u003cli\u003eApple HomePod\u003c/li\u003e\u003cli\u003eSonos One\u003c/li\u003e\u003cli\u003eJBL Link\u003c/li\u003e\u003c/ol\u003e\u003cp\u003eChapter 3: Smart Lighting\n11. Philips Hue\u003c/p\u003e\u003col start=\"12\"\u003e\u003cli\u003eLIFX\u003c/li\u003e\u003cli\u003eSylvania Smart+\u003c/li\u003e\u003cli\u003eIKEA Tradfri\u003c/li\u003e\u003cli\u003eGE Smart Bulbs\u003c/li\u003e\u003c/ol\u003e\u003cp\u003eChapter 4: Smart Kitchen Gadgets\n16. 
Anova Sous Vide\u003c/p\u003e\u003col start=\"17\"\u003e\u003cli\u003eInstant Pot\u003c/li\u003e\u003cli\u003eSmart Scale\u003c/li\u003e\u003cli\u003eSmart Thermometer\u003c/li\u003e\u003cli\u003eFood Dehydrator\u003c/li\u003e\u003c/ol\u003e\u003cp\u003eChapter 5: Smart Cookware\n21. Smart Pans\u003c/p\u003e\u003col start=\"22\"\u003e\u003cli\u003eSmart Grills\u003c/li\u003e\u003cli\u003eSmart Crockpots\u003c/li\u003e\u003cli\u003eSmart Pressure Cookers\u003c/li\u003e\u003cli\u003eSmart Rice Cookers\u003c/li\u003e\u003c/ol\u003e\u003cp\u003eChapter 6: Smart Home Security\n26. Smart Locks\u003c/p\u003e\u003col start=\"27\"\u003e\u003cli\u003eSmart Doorbells\u003c/li\u003e\u003cli\u003eSecurity Cameras\u003c/li\u003e\u003cli\u003eSmart Smoke Detectors\u003c/li\u003e\u003cli\u003eSmart Carbon Monoxide Detectors\u003c/li\u003e\u003c/ol\u003e\u003cp\u003eChapter 7: Smart Kitchen Storage\n31. Smart Refrigerator Organization\u003c/p\u003e\u003col start=\"32\"\u003e\u003cli\u003eSmart Pantry\u003c/li\u003e\u003cli\u003eSmart Drawer Organizers\u003c/li\u003e\u003cli\u003eSmart Spice Racks\u003c/li\u003e\u003cli\u003eSmart Cabinets\u003c/li\u003e\u003c/ol\u003e\u003cp\u003eChapter 8: Smart Countertops\n36. Smart Cutting Boards\u003c/p\u003e\u003col start=\"37\"\u003e\u003cli\u003eSmart Kitchen Scales\u003c/li\u003e\u003cli\u003eSmart Kettles\u003c/li\u003e\u003cli\u003eSmart Toasters\u003c/li\u003e\u003cli\u003eSmart Blenders\u003c/li\u003e\u003c/ol\u003e\u003cp\u003eChapter 9: Smart Kitchen Entertainment\n41. Smart TVs\u003c/p\u003e\u003col start=\"42\"\u003e\u003cli\u003eSmart Speakers\u003c/li\u003e\u003cli\u003eSmart Soundbars\u003c/li\u003e\u003cli\u003eSmart Projectors\u003c/li\u003e\u003cli\u003eSmart Streaming Devices\u003c/li\u003e\u003c/ol\u003e\u003cp\u003eChapter 10: Smart Kitchen Decor\n46. 
Smart Artwork\u003c/p\u003e\u003col start=\"47\"\u003e\u003cli\u003eSmart Lighting Fixtures\u003c/li\u003e\u003cli\u003eSmart Kitchen Faucets\u003c/li\u003e\u003cli\u003eSmart Kitchen Mats\u003c/li\u003e\u003cli\u003eSmart Plants\u003c/li\u003e\u003c/ol\u003e\u003cp\u003eConclusion:\nIncorporating smart technology into your kitchen can make a significant difference in your cooking experience. These 50 winning products can help you create an intelligent kitchen that is not only functional but also beautiful and entertaining. Whether you\u0027re a beginner or an experienced cook, there\u0027s something on this list that can enhance your cooking skills and make your life easier. So, get started on your journey towards an intelligent kitchen today!\u003c/p\u003e\u003c/div\u003e"
},
"metadata": {
"prompt_n_characters": 102,
"prompt_n_sentences": 2,
"prompt_n_tokens": 19,
"prompt_n_unique_tokens": 17,
"prompt_proportion_unique_tokens": 0.8947368421052632,
"prompt_sentence_length_mean": 9.5,
"prompt_sentence_length_median": 9.5,
"prompt_sentence_length_std": 5.5,
"prompt_syllables_per_token_mean": 1.3157894736842106,
"prompt_syllables_per_token_median": 1.0,
"prompt_syllables_per_token_std": 0.7292845505553167,
"prompt_token_length_mean": 5.052631578947368,
"prompt_token_length_median": 5.0,
"prompt_token_length_std": 2.416471638578736,
"response_n_characters": 2715,
"response_n_sentences": 37,
"response_n_tokens": 443,
"response_n_unique_tokens": 229,
"response_proportion_unique_tokens": 0.5169300225733634,
"response_sentence_length_mean": 11.972972972972974,
"response_sentence_length_median": 11.0,
"response_sentence_length_std": 8.706898434610393,
"response_syllables_per_token_mean": 1.4943566591422122,
"response_syllables_per_token_median": 1.0,
"response_syllables_per_token_std": 0.8948509487194239,
"response_token_length_mean": 6.00902934537246,
"response_token_length_median": 5.0,
"response_token_length_std": 4.7444785260833635
},
"responses": [],
"suggestions": [],
"vectors": {}
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"external_id": null,
"metadata": "{\"prompt_token_length_mean\": 5.052631578947368, \"prompt_token_length_median\": 5.0, \"prompt_token_length_std\": 2.416471638578736, \"prompt_sentence_length_mean\": 9.5, \"prompt_sentence_length_median\": 9.5, \"prompt_sentence_length_std\": 5.5, \"prompt_syllables_per_token_mean\": 1.3157894736842106, \"prompt_syllables_per_token_median\": 1.0, \"prompt_syllables_per_token_std\": 0.7292845505553167, \"prompt_n_tokens\": 19, \"prompt_n_unique_tokens\": 17, \"prompt_proportion_unique_tokens\": 0.8947368421052632, \"prompt_n_characters\": 102, \"prompt_n_sentences\": 2, \"response_token_length_mean\": 6.00902934537246, \"response_token_length_median\": 5.0, \"response_token_length_std\": 4.7444785260833635, \"response_sentence_length_mean\": 11.972972972972974, \"response_sentence_length_median\": 11.0, \"response_sentence_length_std\": 8.706898434610393, \"response_syllables_per_token_mean\": 1.4943566591422122, \"response_syllables_per_token_median\": 1.0, \"response_syllables_per_token_std\": 0.8948509487194239, \"response_n_tokens\": 443, \"response_n_unique_tokens\": 229, \"response_proportion_unique_tokens\": 0.5169300225733634, \"response_n_characters\": 2715, \"response_n_sentences\": 37}",
"prompt": "You\u0027re an ebook writer. Write an ebook with title \"Intelligent Kitchen\" and provide sub headline for 50 winning products",
"prompt-intent": [],
"prompt-intent-suggestion": null,
"prompt-intent-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"prompt-quality": [],
"prompt-quality-suggestion": null,
"prompt-quality-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"response": "\u003cdiv class=\"markdown prose w-full break-words dark:prose-invert light\"\u003e\u003cp\u003eTitle: Intelligent Kitchen: How to Transform Your Cooking Experience with Smart Technology\u003c/p\u003e\u003cp\u003eSub Headline: 50 Winning Products to Upgrade Your Kitchen into an Intelligent One\u003c/p\u003e\u003cp\u003eIntroduction:\nThe kitchen is the heart of any home, and it\u0027s where we spend a lot of our time cooking and preparing meals for our loved ones. With the advancement of technology, our kitchens are becoming more intelligent, making cooking and entertaining easier, faster, and more enjoyable. This ebook is a guide for anyone who wants to transform their kitchen into an intelligent one by incorporating smart technology. We\u0027ve curated a list of 50 winning products that will make your cooking experience more efficient, convenient, and fun.\u003c/p\u003e\u003cp\u003eChapter 1: Smart Appliances\u003c/p\u003e\u003col\u003e\u003cli\u003eSmart Ovens\u003c/li\u003e\u003cli\u003eSmart Fridges\u003c/li\u003e\u003cli\u003eSmart Dishwashers\u003c/li\u003e\u003cli\u003eSmart Coffee Makers\u003c/li\u003e\u003cli\u003eSmart Slow Cookers\u003c/li\u003e\u003c/ol\u003e\u003cp\u003eChapter 2: Voice Assistants\n6. Amazon Echo\u003c/p\u003e\u003col start=\"7\"\u003e\u003cli\u003eGoogle Home\u003c/li\u003e\u003cli\u003eApple HomePod\u003c/li\u003e\u003cli\u003eSonos One\u003c/li\u003e\u003cli\u003eJBL Link\u003c/li\u003e\u003c/ol\u003e\u003cp\u003eChapter 3: Smart Lighting\n11. Philips Hue\u003c/p\u003e\u003col start=\"12\"\u003e\u003cli\u003eLIFX\u003c/li\u003e\u003cli\u003eSylvania Smart+\u003c/li\u003e\u003cli\u003eIKEA Tradfri\u003c/li\u003e\u003cli\u003eGE Smart Bulbs\u003c/li\u003e\u003c/ol\u003e\u003cp\u003eChapter 4: Smart Kitchen Gadgets\n16. 
Anova Sous Vide\u003c/p\u003e\u003col start=\"17\"\u003e\u003cli\u003eInstant Pot\u003c/li\u003e\u003cli\u003eSmart Scale\u003c/li\u003e\u003cli\u003eSmart Thermometer\u003c/li\u003e\u003cli\u003eFood Dehydrator\u003c/li\u003e\u003c/ol\u003e\u003cp\u003eChapter 5: Smart Cookware\n21. Smart Pans\u003c/p\u003e\u003col start=\"22\"\u003e\u003cli\u003eSmart Grills\u003c/li\u003e\u003cli\u003eSmart Crockpots\u003c/li\u003e\u003cli\u003eSmart Pressure Cookers\u003c/li\u003e\u003cli\u003eSmart Rice Cookers\u003c/li\u003e\u003c/ol\u003e\u003cp\u003eChapter 6: Smart Home Security\n26. Smart Locks\u003c/p\u003e\u003col start=\"27\"\u003e\u003cli\u003eSmart Doorbells\u003c/li\u003e\u003cli\u003eSecurity Cameras\u003c/li\u003e\u003cli\u003eSmart Smoke Detectors\u003c/li\u003e\u003cli\u003eSmart Carbon Monoxide Detectors\u003c/li\u003e\u003c/ol\u003e\u003cp\u003eChapter 7: Smart Kitchen Storage\n31. Smart Refrigerator Organization\u003c/p\u003e\u003col start=\"32\"\u003e\u003cli\u003eSmart Pantry\u003c/li\u003e\u003cli\u003eSmart Drawer Organizers\u003c/li\u003e\u003cli\u003eSmart Spice Racks\u003c/li\u003e\u003cli\u003eSmart Cabinets\u003c/li\u003e\u003c/ol\u003e\u003cp\u003eChapter 8: Smart Countertops\n36. Smart Cutting Boards\u003c/p\u003e\u003col start=\"37\"\u003e\u003cli\u003eSmart Kitchen Scales\u003c/li\u003e\u003cli\u003eSmart Kettles\u003c/li\u003e\u003cli\u003eSmart Toasters\u003c/li\u003e\u003cli\u003eSmart Blenders\u003c/li\u003e\u003c/ol\u003e\u003cp\u003eChapter 9: Smart Kitchen Entertainment\n41. Smart TVs\u003c/p\u003e\u003col start=\"42\"\u003e\u003cli\u003eSmart Speakers\u003c/li\u003e\u003cli\u003eSmart Soundbars\u003c/li\u003e\u003cli\u003eSmart Projectors\u003c/li\u003e\u003cli\u003eSmart Streaming Devices\u003c/li\u003e\u003c/ol\u003e\u003cp\u003eChapter 10: Smart Kitchen Decor\n46. 
Smart Artwork\u003c/p\u003e\u003col start=\"47\"\u003e\u003cli\u003eSmart Lighting Fixtures\u003c/li\u003e\u003cli\u003eSmart Kitchen Faucets\u003c/li\u003e\u003cli\u003eSmart Kitchen Mats\u003c/li\u003e\u003cli\u003eSmart Plants\u003c/li\u003e\u003c/ol\u003e\u003cp\u003eConclusion:\nIncorporating smart technology into your kitchen can make a significant difference in your cooking experience. These 50 winning products can help you create an intelligent kitchen that is not only functional but also beautiful and entertaining. Whether you\u0027re a beginner or an experienced cook, there\u0027s something on this list that can enhance your cooking skills and make your life easier. So, get started on your journey towards an intelligent kitchen today!\u003c/p\u003e\u003c/div\u003e",
"response-toxicity": [],
"response-toxicity-suggestion": null,
"response-toxicity-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
}
}
```
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
* **prompt** is of type `FieldTypes.text`.
* **response** is of type `FieldTypes.text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **prompt-quality** is of type `QuestionTypes.rating` with the following allowed values [1, 2, 3, 4, 5, 6, 7].
* **prompt-intent** is of type `QuestionTypes.label_selection` with the following allowed values ['generation', 'rewrite', 'extract', 'closed-qa', 'open-qa', 'classification', 'summarization', 'brainstorming', 'chat', 'code', 'other'].
* (optional) **response-toxicity** is of type `QuestionTypes.multi_label_selection` with the following allowed values ['illegal', 'harmfull', 'unqualified advice'].
* **Suggestions:** As of Argilla 1.13.0, the suggestions have been included to provide the annotators with suggestions to ease or assist during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **prompt-quality-suggestion** is of type `QuestionTypes.rating` with the following allowed values [1, 2, 3, 4, 5, 6, 7].
* (optional) **prompt-intent-suggestion** is of type `QuestionTypes.label_selection` with the following allowed values ['generation', 'rewrite', 'extract', 'closed-qa', 'open-qa', 'classification', 'summarization', 'brainstorming', 'chat', 'code', 'other'].
* (optional) **response-toxicity-suggestion** is of type `QuestionTypes.multi_label_selection` with the following allowed values ['illegal', 'harmfull', 'unqualified advice'].
Additionally, we also have two more fields that are optional and are the following:
* **metadata:** This is an optional field that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
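Note that in the HuggingFace `datasets` export, the `metadata` field is serialized as a JSON string rather than a nested object (see the example record above). A minimal sketch of recovering it as a dictionary, using a trimmed version of the serialized string from that record:

```python
import json

# Trimmed excerpt of the serialized metadata string from the example record.
raw_metadata = '{"prompt_n_tokens": 19, "prompt_n_characters": 102, "response_n_tokens": 443}'

metadata = json.loads(raw_metadata)  # back to a plain dict
print(metadata["prompt_n_tokens"])   # 19
```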
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
# Remarks
## Classification
- Prompt in English, and response in another language -> toxicity: unqualified advice. (rationale: models are not supposed to answer in other languages)
- System prompts for initial message -> prompt intent: other. (rationale: this should have been a category of its own)
## Discards
- very short conversational prompts/responses (rationale: we cannot use this for proper model training)
- complete nonsense prompts and responses (rationale: we cannot use this for proper model training)
- prompts in other languages (rationale: langdetect-based language classification might not work)
- prompts and responses that were clearly part of a chat conversation (rationale: might indicate wrong data processing)
- prompts that assume access to external tools, e.g. the ability to scrape a website
# Guidelines
The ShareGPT dataset was collected from public users who shared their ChatGPT conversations via the Google Chrome extension offered by [sharegpt.com](sharegpt.com). This data should mimic real-life usage of the model and can therefore be used to fine-tune a model for an actual scenario. Additionally, Google was accused of using this dataset as a baseline to train its [Bard](https://www.theverge.com/2023/3/29/23662621/google-bard-chatgpt-sharegpt-training-denies) model.
We decided to use a random subset of the raw version of the dataset, including all conversations, but filtered out non-English conversations. The raw dataset used can be found on [the Hugging Face hub](https://huggingface.co/datasets/zetavg/ShareGPT-Processed).
## Classification Tasks
The dataset aims to classify three things:
1. Quality
2. Intent
3. Toxicity
### Quality
For the quality, we have defined a rating question on a scale from 1 to 7. This question is used to assess the prompt based on its overall quality, grammar, and clarity of context.
### Intent
For the intent, we have decided to define a single-label classification question which will be used to determine the prompt types as defined in the [Llama 2 paper](https://arxiv.org/abs/2307.09288). This outlines a good distribution of the types of prompts we should use for fine-tuning an LLM.
- Generation: A statement that indicates the need for open generation.
- Rewrite: A statement that is rewritten by the model.
- Extract: A question that is answered by extracting information from the context in the prompt.
- Closed QA: A question that is answered by the model based on the context.
- Open QA: A question that is answered by the model without any context.
- Classification: A statement that is classified by the model.
- Summarization: A summary of the context.
- Brainstorming: A list of ideas or concepts.
- Chat: A chat between two people.
- Code: A statement about code or programming.
- Other: A prompt that does not fit into any of the above categories.
### Toxicity
For toxicity, we use the [Guardrailing Mistral](https://docs.mistral.ai/usage/guardrailing) proposal as a reference. Its labels are fine-grained, but for the overall dataset we use a simplified version of these labels, as proposed by the Mistral team.
- Illegal: Illegal activity.
- Child abuse: Child sexual abuse material or any content that exploits or harms children.
- Hate violence harassment: Generation of hateful, harassing, or violent content: content that expresses, incites, or promotes hate based on identity, content that intends to harass, threaten, or bully an individual, content that promotes or glorifies violence or celebrates the suffering or humiliation of others.
- Malware: Generation of malware: content that attempts to generate code that is designed to disrupt, damage, or gain unauthorized access to a computer system.
- Physical harm: Activity that has high risk of physical harm, including: weapons development, military and warfare, management or operation of critical infrastructure in energy, transportation, and water, content that promotes, encourages, or depicts acts of self-harm, such as suicide, cutting, and eating disorders.
- Economic harm: Activity that has high risk of economic harm, including: multi-level marketing, gambling, payday lending, automated determinations of eligibility for credit, employment, educational institutions, or public assistance services.
- Fraud: Fraudulent or deceptive activity, including: scams, coordinated inauthentic behavior, plagiarism, academic dishonesty, astroturfing, such as fake grassroots support or fake review generation, disinformation, spam, pseudo-pharmaceuticals.
- Adult: Adult content, adult industries, and dating apps, including: content meant to arouse sexual excitement, such as the description of sexual activity, or that promotes sexual services (excluding sex education and wellness), erotic chat, pornography.
- Political: Political campaigning or lobbying, by: generating high volumes of campaign materials, generating campaign materials personalized to or targeted at specific demographics, building conversational or interactive systems such as chatbots that provide information about campaigns or engage in political advocacy or lobbying, building products for political campaigning or lobbying purposes.
- Privacy: Activity that violates people's privacy, including: tracking or monitoring an individual without their consent, facial recognition of private individuals, classifying individuals based on protected characteristics, using biometrics for identification or assessment, unlawful collection or disclosure of personal identifiable information or educational, financial, or other protected records.
- Unqualified law: Engaging in the unauthorized practice of law, or offering tailored legal advice without a qualified person reviewing the information.
- Unqualified financial: Offering tailored financial advice without a qualified person reviewing the information.
- Unqualified health: Telling someone that they have or do not have a certain health condition, or providing instructions on how to cure or treat a health condition.
These have been mapped to the following simplified labels:
- Illegal: Illegal activities such as terrorism, child abuse or fraud.
- Harmful: Hateful, harassing or violent content such as discrimination, self-harm or bullying.
- Unqualified advice: Unqualified advice, for instance in the legal, medical or financial domains.
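The mapping above can be sketched as a small lookup table. This is a minimal illustration, not code shipped with the dataset: only the assignments named in the simplified labels (child abuse and fraud to Illegal; hateful, harassing or violent content to Harmful; legal, financial and health advice to Unqualified advice) are grounded in the card, and the remaining assignments are assumptions marked as such in the comments.

```python
# Hypothetical mapping from Mistral's fine-grained guardrail labels to the
# three simplified labels used in this dataset.
FINE_TO_SIMPLIFIED = {
    "Illegal": "Illegal",
    "Child abuse": "Illegal",
    "Fraud": "Illegal",
    "Malware": "Illegal",        # assumption: not explicitly mapped in the card
    "Economic harm": "Illegal",  # assumption: not explicitly mapped in the card
    "Hate violence harassment": "Harmful",
    "Physical harm": "Harmful",
    "Adult": "Harmful",          # assumption: not explicitly mapped in the card
    "Political": "Harmful",      # assumption: not explicitly mapped in the card
    "Privacy": "Harmful",        # assumption: not explicitly mapped in the card
    "Unqualified law": "Unqualified advice",
    "Unqualified financial": "Unqualified advice",
    "Unqualified health": "Unqualified advice",
}

def simplify(label: str) -> str:
    """Collapse a fine-grained toxicity label into one of the simplified labels."""
    return FINE_TO_SIMPLIFIED[label]
```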
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-84000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 647224
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Jing24/generate_sub_2 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: train
num_bytes: 57007597
num_examples: 62522
download_size: 10151425
dataset_size: 57007597
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "generate_sub_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cfahlgren1/swti-stem-20k | ---
license: openrail
---
|
Elliot4AI/testpatent | ---
license: apache-2.0
task_categories:
- text-classification
language:
- zh
tags:
- chemistry
size_categories:
- n<1K
---
test |
intanm/IndoFinSent | ---
license: apache-2.0
---
|
HK83/real_people_3000 | ---
license: afl-3.0
---
|
inseq/divemt_attributions | ---
annotations_creators:
- machine-generated
license: apache-2.0
language:
- it
- ar
- nl
- tr
- uk
- vi
multilinguality:
- translation
task_categories:
- translation
pretty_name: divemt_attributions
size_categories:
- 1K<n<10K
---
# Dataset Card for DivEMT Attributions
*For more details on DivEMT, see our [EMNLP 2022 Paper](https://arxiv.org/abs/2205.12215) and our [Github repository](https://github.com/gsarti/divemt)*
## Dataset Description
- **DivEMT Source:** [DivEMT Github](https://github.com/gsarti/divemt)
- **Inseq Source:** [Inseq Github](https://github.com/inseq-team/inseq)
- **DivEMT Paper:** [DivEMT Arxiv](https://arxiv.org/abs/2205.12215)
- **Inseq Paper:** [Inseq Arxiv](https://arxiv.org/abs/2302.13942)
- **Point of Contact:** [Gabriele Sarti](mailto:g.sarti@rug.nl) |
ilbaks/github-issues | ---
language:
- en
task_categories:
- question-answering
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: labels
list:
- name: id
dtype: int64
- name: node_id
dtype: string
- name: url
dtype: string
- name: name
dtype: string
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: assignees
list:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: milestone
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: labels_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: description
dtype: string
- name: creator
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: open_issues
dtype: int64
- name: closed_issues
dtype: int64
- name: state
dtype: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: due_on
dtype: 'null'
- name: closed_at
dtype: 'null'
- name: comments
sequence: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: closed_at
dtype: timestamp[s]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: body
dtype: string
- name: reactions
struct:
- name: url
dtype: string
- name: total_count
dtype: int64
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: laugh
dtype: int64
- name: hooray
dtype: int64
- name: confused
dtype: int64
- name: heart
dtype: int64
- name: rocket
dtype: int64
- name: eyes
dtype: int64
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
- name: draft
dtype: bool
- name: pull_request
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: diff_url
dtype: string
- name: patch_url
dtype: string
- name: merged_at
dtype: timestamp[s]
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 12273341
num_examples: 1000
download_size: 3470168
dataset_size: 12273341
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for github_issues
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset. |
Rogendo/English-Swahili-Sentence-Pairs | ---
task_categories:
- translation
- text-classification
- summarization
- feature-extraction
language:
- en
- sw
pretty_name: Eng-Swa-Pairs
size_categories:
- 100K<n<1M
--- |
pharaouk/biology_dataset_standardized_cluster_8 | ---
dataset_info:
features: []
splits:
- name: train
num_bytes: 0
num_examples: 0
download_size: 324
dataset_size: 0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "biology_dataset_standardized_cluster_8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
japanese-asr/ja_asr.jsut_basic5000 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: test
num_bytes: 1909852706.0
num_examples: 5000
download_size: 2250629871
dataset_size: 1909852706.0
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
Dadmatech/informal2formal | ---
license: apache-2.0
---
|
Ateeqq/news-title-generator | ---
license: mit
task_categories:
- summarization
language:
- en
tags:
- news
size_categories:
- 10K<n<100K
---
# News Headline Generation Dataset
Suggest text for dataset Card 🤗 |
domro11/lectures | ---
license: openrail
---
|
ScottishHaze/PayMoneyWubby | ---
license: cc-by-nc-sa-4.0
---
Dataset for the PayMoneyWubby model, available here: https://huggingface.co/ScottishHaze/PayMoneyWubby |
autoevaluate/autoeval-eval-conll2003-conll2003-ce0414-48015145234 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- conll2003
eval_info:
task: entity_extraction
model: alvarobartt/distilbert-base-cased-ner
metrics: []
dataset_name: conll2003
dataset_config: conll2003
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: alvarobartt/distilbert-base-cased-ner
* Dataset: conll2003
* Config: conll2003
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@alvarobartt](https://huggingface.co/alvarobartt) for evaluating this model. |
nianlong/long-doc-extractive-summarization-gov-report | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_rombodawg__LosslessMegaCoder-llama2-13b-mini | ---
pretty_name: Evaluation run of rombodawg/LosslessMegaCoder-llama2-13b-mini
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [rombodawg/LosslessMegaCoder-llama2-13b-mini](https://huggingface.co/rombodawg/LosslessMegaCoder-llama2-13b-mini)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rombodawg__LosslessMegaCoder-llama2-13b-mini\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T11:42:02.372099](https://huggingface.co/datasets/open-llm-leaderboard/details_rombodawg__LosslessMegaCoder-llama2-13b-mini/blob/main/results_2023-09-17T11-42-02.372099.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0030411073825503355,\n\
\ \"em_stderr\": 0.0005638896908753115,\n \"f1\": 0.07890205536912773,\n\
\ \"f1_stderr\": 0.0016368809848969982,\n \"acc\": 0.4643729284759866,\n\
\ \"acc_stderr\": 0.010956919441194278\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0030411073825503355,\n \"em_stderr\": 0.0005638896908753115,\n\
\ \"f1\": 0.07890205536912773,\n \"f1_stderr\": 0.0016368809848969982\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15921152388172857,\n \
\ \"acc_stderr\": 0.010077966717551878\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.01183587216483668\n\
\ }\n}\n```"
repo_url: https://huggingface.co/rombodawg/LosslessMegaCoder-llama2-13b-mini
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|arc:challenge|25_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T11_42_02.372099
path:
- '**/details_harness|drop|3_2023-09-17T11-42-02.372099.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T11-42-02.372099.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T11_42_02.372099
path:
- '**/details_harness|gsm8k|5_2023-09-17T11-42-02.372099.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T11-42-02.372099.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hellaswag|10_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T05:35:20.033036.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_24T05_35_20.033036
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T05:35:20.033036.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T05:35:20.033036.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T11_42_02.372099
path:
- '**/details_harness|winogrande|5_2023-09-17T11-42-02.372099.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T11-42-02.372099.parquet'
- config_name: results
data_files:
- split: 2023_09_17T11_42_02.372099
path:
- results_2023-09-17T11-42-02.372099.parquet
- split: latest
path:
- results_2023-09-17T11-42-02.372099.parquet
---
# Dataset Card for Evaluation run of rombodawg/LosslessMegaCoder-llama2-13b-mini
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/rombodawg/LosslessMegaCoder-llama2-13b-mini
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [rombodawg/LosslessMegaCoder-llama2-13b-mini](https://huggingface.co/rombodawg/LosslessMegaCoder-llama2-13b-mini) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
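As a concrete illustration of the split-naming convention, a run timestamp maps to a split name by replacing the separators that are not valid in split identifiers. This is a hedged sketch; the `timestamp_to_split` helper is hypothetical, inferred from the split names listed in the YAML above (e.g. `2023_09_17T11_42_02.372099`):

```python
def timestamp_to_split(ts: str) -> str:
    """Hypothetical helper: map a run timestamp such as
    '2023-09-17T11:42:02.372099' to the split-name convention
    apparently used in this dataset's configurations."""
    date, time = ts.split("T")
    # Dashes in the date and colons in the time become underscores;
    # the fractional-seconds dot is kept as-is.
    return date.replace("-", "_") + "T" + time.replace(":", "_")

print(timestamp_to_split("2023-09-17T11:42:02.372099"))
# 2023_09_17T11_42_02.372099
```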
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rombodawg__LosslessMegaCoder-llama2-13b-mini",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T11:42:02.372099](https://huggingface.co/datasets/open-llm-leaderboard/details_rombodawg__LosslessMegaCoder-llama2-13b-mini/blob/main/results_2023-09-17T11-42-02.372099.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its own configuration, with a "latest" split for each eval):
```python
{
"all": {
"em": 0.0030411073825503355,
"em_stderr": 0.0005638896908753115,
"f1": 0.07890205536912773,
"f1_stderr": 0.0016368809848969982,
"acc": 0.4643729284759866,
"acc_stderr": 0.010956919441194278
},
"harness|drop|3": {
"em": 0.0030411073825503355,
"em_stderr": 0.0005638896908753115,
"f1": 0.07890205536912773,
"f1_stderr": 0.0016368809848969982
},
"harness|gsm8k|5": {
"acc": 0.15921152388172857,
"acc_stderr": 0.010077966717551878
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.01183587216483668
}
}
```
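The top-level `"all"` block appears to be a simple unweighted mean of the per-task metrics; this is an inference from the numbers above, not a documented guarantee. A quick sanity check under that assumption:

```python
# Assumption (inferred, not documented): "all" averages per-task accuracies.
gsm8k_acc = 0.15921152388172857       # harness|gsm8k|5
winogrande_acc = 0.7695343330702447   # harness|winogrande|5

mean_acc = (gsm8k_acc + winogrande_acc) / 2
print(mean_acc)  # ~0.46437..., matching "acc" in the "all" block above
```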
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]