---
pretty_name: Evaluation run of digitous/Adventien-GPTJ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [digitous/Adventien-GPTJ](https://huggingface.co/digitous/Adventien-GPTJ) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of\
  \ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of\
  \ the run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_digitous__Adventien-GPTJ\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2023-09-22T18:30:15.376611](https://huggingface.co/datasets/open-llm-leaderboard/details_digitous__Adventien-GPTJ/blob/main/results_2023-09-22T18-30-15.376611.json)\
  \ (note that there might be results for other tasks in the repos if successive\
  \ evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0008389261744966443,\n\
\ \"em_stderr\": 0.0002964962989801232,\n \"f1\": 0.04690331375838923,\n\
\ \"f1_stderr\": 0.0011372681519599575,\n \"acc\": 0.309065548569777,\n\
\ \"acc_stderr\": 0.008601781393069023\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.0002964962989801232,\n\
\ \"f1\": 0.04690331375838923,\n \"f1_stderr\": 0.0011372681519599575\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01592115238817286,\n \
\ \"acc_stderr\": 0.003447819272389025\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6022099447513812,\n \"acc_stderr\": 0.013755743513749022\n\
\ }\n}\n```"
repo_url: https://huggingface.co/digitous/Adventien-GPTJ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|arc:challenge|25_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T18_30_15.376611
path:
- '**/details_harness|drop|3_2023-09-22T18-30-15.376611.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T18-30-15.376611.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T18_30_15.376611
path:
- '**/details_harness|gsm8k|5_2023-09-22T18-30-15.376611.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T18-30-15.376611.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hellaswag|10_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T20:04:02.923110.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T20:04:02.923110.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T20:04:02.923110.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T18_30_15.376611
path:
- '**/details_harness|winogrande|5_2023-09-22T18-30-15.376611.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T18-30-15.376611.parquet'
- config_name: results
data_files:
- split: 2023_07_19T20_04_02.923110
path:
- results_2023-07-19T20:04:02.923110.parquet
- split: 2023_09_22T18_30_15.376611
path:
- results_2023-09-22T18-30-15.376611.parquet
- split: latest
path:
- results_2023-09-22T18-30-15.376611.parquet
---
# Dataset Card for Evaluation run of digitous/Adventien-GPTJ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/digitous/Adventien-GPTJ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [digitous/Adventien-GPTJ](https://huggingface.co/digitous/Adventien-GPTJ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_digitous__Adventien-GPTJ",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-22T18:30:15.376611](https://huggingface.co/datasets/open-llm-leaderboard/details_digitous__Adventien-GPTJ/blob/main/results_2023-09-22T18-30-15.376611.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0008389261744966443,
"em_stderr": 0.0002964962989801232,
"f1": 0.04690331375838923,
"f1_stderr": 0.0011372681519599575,
"acc": 0.309065548569777,
"acc_stderr": 0.008601781393069023
},
"harness|drop|3": {
"em": 0.0008389261744966443,
"em_stderr": 0.0002964962989801232,
"f1": 0.04690331375838923,
"f1_stderr": 0.0011372681519599575
},
"harness|gsm8k|5": {
"acc": 0.01592115238817286,
"acc_stderr": 0.003447819272389025
},
"harness|winogrande|5": {
"acc": 0.6022099447513812,
"acc_stderr": 0.013755743513749022
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
TrainingDataPro/amazon-reviews-dataset | ---
license: cc-by-nc-nd-4.0
task_categories:
- text-classification
language:
- en
tags:
- code
---
# Amazon Reviews Dataset
The Amazon Reviews Dataset is a comprehensive collection of customer reviews obtained from the popular e-commerce website, Amazon.com. This dataset encompasses reviews written in **5** different languages, making it a valuable resource for conducting **multilingual sentiment analysis and opinion mining**.
The dataset's multilingual nature makes it useful for natural language processing tasks, sentiment analysis algorithms, and other machine learning applications that require diverse language data for training and evaluation.
The dataset can be highly valuable in training and fine-tuning machine learning models to *automatically classify sentiments, predict customer satisfaction, or extract key information from customer reviews*.
### Languages in the dataset:
- Italian
- German
- Spanish
- French
- English
# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market/marketplace-scraping-data?utm_source=huggingface&utm_medium=cpc&utm_campaign=amazon-reviews-dataset) to discuss your requirements, learn about the price and buy the dataset.
# Content
For each item, we extracted:
- **user_name**: name of the reviewer
- **stars**: number of stars given by the reviewer
- **country**: country of the author
- **date**: date of the review
- **title**: title of the review
- **text**: text of the review
- **helpful**: number of people who found the review helpful
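As a sketch of how these fields might be used once you have the data (the rows below are invented for illustration, and the naive star-to-sentiment mapping is just one common way to bootstrap sentiment labels):

```python
# Illustrative only: the rows are invented; real data comes from the provider.
# Field names follow the list above.
reviews = [
    {"user_name": "anna", "stars": 5, "country": "Germany", "date": "2023-01-04",
     "title": "Great", "text": "Works perfectly.", "helpful": 12},
    {"user_name": "marc", "stars": 2, "country": "France", "date": "2023-02-11",
     "title": "Meh", "text": "Broke after a week.", "helpful": 3},
    {"user_name": "jose", "stars": 4, "country": "Spain", "date": "2023-03-20",
     "title": "Good", "text": "Good value for money.", "helpful": 7},
]

# A naive star-based sentiment label, e.g. for bootstrapping a classifier.
def star_sentiment(stars: int) -> str:
    if stars >= 4:
        return "positive"
    if stars == 3:
        return "neutral"
    return "negative"

labels = [star_sentiment(r["stars"]) for r in reviews]
print(labels)  # ['positive', 'negative', 'positive']
```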
# Amazon Reviews might be collected in accordance with your requirements.
## [**TrainingData**](https://trainingdata.pro/data-market/marketplace-scraping-data?utm_source=huggingface&utm_medium=cpc&utm_campaign=amazon-reviews-dataset) provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets** |
harpreetsahota/gemma_vibe_check_ifeval | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: DeciLM-7B-Instruct
dtype: string
- name: Gemma-7B-it
dtype: string
- name: conciseness_DeciLM-7B-Instruct
struct:
- name: reasoning
dtype: string
- name: score
dtype: 'null'
- name: value
dtype: string
- name: conciseness_Gemma-7B-it
struct:
- name: reasoning
dtype: string
- name: score
dtype: 'null'
- name: value
dtype: string
splits:
- name: train
num_bytes: 310360
num_examples: 100
download_size: 177103
dataset_size: 310360
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
FanChen0116/19100_chat_05x_slot_pvi | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: labels
sequence:
class_label:
names:
'0': O
'1': I-time
'2': B-date
'3': B-last_name
'4': B-people
'5': I-date
'6': I-people
'7': I-last_name
'8': I-first_name
'9': B-first_name
'10': B-time
- name: request_slot
sequence: string
splits:
- name: train
num_bytes: 5796
num_examples: 32
- name: validation
num_bytes: 5405
num_examples: 32
- name: test
num_bytes: 646729
num_examples: 3731
download_size: 0
dataset_size: 657930
---
# Dataset Card for "19100_chat_05x_slot_pvi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-one-sec-cv12/chunk_175 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 971721636
num_examples: 190833
download_size: 991426532
dataset_size: 971721636
---
# Dataset Card for "chunk_175"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test-mathemakitt-596cbd-1668659069 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test
eval_info:
task: text_zero_shot_classification
model: facebook/opt-350m
metrics: ['f1', 'perplexity']
dataset_name: mathemakitten/winobias_antistereotype_test
dataset_config: mathemakitten--winobias_antistereotype_test
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-350m
* Dataset: mathemakitten/winobias_antistereotype_test
* Config: mathemakitten--winobias_antistereotype_test
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@ddcas](https://huggingface.co/ddcas) for evaluating this model. |
ThWu/cleaned_prompt_r | ---
dataset_info:
features:
- name: conversations
sequence: string
splits:
- name: train
num_bytes: 157911562
num_examples: 268781
download_size: 97143836
dataset_size: 157911562
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "cleaned_prompt_r"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HichTala/yugioh_dataset | ---
license: mit
task_categories:
- image-classification
language:
- en
size_categories:
- 1K<n<10K
---
<div align="center">
<p>
<a href="https://www.github.com/hichtala/draw" target="_blank">
<img src="https://raw.githubusercontent.com/HichTala/draw/master/figures/banner-draw.png">
</p>
DRAW (which stands for **D**etect and **R**ecognize **A** **W**ild range of cards) is the very first object detector
trained to detect _Yu-Gi-Oh!_ cards in all types of images, and in particular in dueling images.
Other works exist (see [Related Works](#div-aligncenterrelated-worksdiv)) but none is capable of recognizing cards during a duel.
DRAW is entirely open source and all contributions are welcome.
</div>
---
## <div align="center">📄Documentation</div>
<details open>
<summary>
Install
</summary>
Both a docker installation and a more conventional installation are available. If you're not very familiar with all the code,
docker installation is recommended. Otherwise, opt for the classic installation.
#### Docker installation
If you are familiar with docker, the docker image is available [here](https://hub.docker.com/r/hichtala/draw).
Otherwise, I recommend you to download [DockerDesktop](https://www.docker.com/products/docker-desktop/) if you are on Windows.
If you are on Linux, you can refer to the documentation [here](https://docs.docker.com/engine/install/).
Once it is done, you simply have to execute the following command,
```shell
docker run -p 5000:5000 --name draw hichtala/draw:latest
```
Your installation is now completed. You can press `Ctrl+C` and continue to Usage section.
#### Classic installation
You need python to be installed. Python installation isn't going to be detailed here, you can refer to the [documentation](https://www.python.org/).
We first need to install pytorch. It is recommended to use a package manager such as [miniconda](https://docs.conda.io/projects/miniconda/en/latest/).
Please refer to the [documentation](https://docs.conda.io/projects/miniconda/en/latest/).
When everything is set up you can run the following command to install pytorch:
```shell
python -m pip install torch torchvision
```
If you want to use you gpus to make everything run faster, please refer the [documentation](https://pytorch.org/get-started/locally/)
Then you just have to clone the repo and install `requirements`:
```shell
git clone https://github.com/HichTala/draw
cd draw
python -m pip install -r requirements.txt
```
Your installation is now completed.
</details>
<details open>
<summary>Usage</summary>
To use it, you need to download the models and the data; see the [Models and Data](#div-aligncentermodels-and-datadiv) section.
Once you have them, follow the instructions below depending on whether you have the Docker or the classic installation.
Put all the models in the same folder, and keep the dataset as it is.
#### Docker installation
You have to copy the data and models into the container. Execute the following commands:
```shell
docker cp path/to/dataset/club_yugioh_dataset draw:/data
docker cp path/to/model/folder draw:/models
```
Once it is done you just have to run the command:
```shell
docker start draw
```
open the address `localhost:5000` and enjoy. Refer to [Both](#both) below for details about the parameters.
#### Classic installation
You need to modify the `config.json` file by putting the path to your dataset folder in the `"data_path"` parameter
and the path to your model folder in the `"trained_models"` parameter.
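For reference, a minimal `config.json` could look like the following (the exact set of keys is defined by the repository; the paths here are placeholders to replace with your own):

```json
{
  "data_path": "path/to/dataset/club_yugioh_dataset",
  "trained_models": "path/to/model/folder"
}
```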
Once done, just run:
```shell
flask --app app.py run
```
open the address `localhost:5000` and enjoy. Refer to [Both](#both) below for details about the parameters.
#### Both
* In the first parameter, the one with gears, put the `config.json` file
* In the second parameter, the one with a camera, put the video you want to process (leave it empty to use your webcam)
* In the last one, put your deck list in the format `ydk`
Then you can press the button and start the process !
</details>
---
## <div align="center">⚙️Models and Data</div>
<details open>
<summary>Models</summary>
In this project, the tasks were divided so that one model would locate the card and another model would classify them.
Similarly, to classify the cards, I divided the task so that there is one model for each type of card,
and the model to be used was determined by the color of the card.
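In pseudo-Python, the two-stage pipeline described above could be sketched as follows. Every function and name here is hypothetical, stand-ins for the real detector and classifiers; see the repository for the actual entry points:

```python
# Hypothetical sketch of the two-stage pipeline: localize cards, then route
# each crop to a type-specific classifier chosen by the card's frame color.
# None of these names come from the DRAW codebase.
def locate_cards(frame):
    # stand-in for the YOLO detector: returns bounding boxes (x, y, w, h)
    return [(10, 10, 100, 150)]

def dominant_frame_color(crop):
    # stand-in for the color heuristic that selects the classifier
    return "spell"

CLASSIFIERS = {
    "spell": lambda crop: "Monster Reborn",   # stand-in for a BEiT model
    "monster": lambda crop: "Dark Magician",  # stand-in for another BEiT model
}

def recognize(frame):
    results = []
    for box in locate_cards(frame):
        crop = box  # a real pipeline would crop the image to the box here
        card_type = dominant_frame_color(crop)
        results.append(CLASSIFIERS[card_type](crop))
    return results

print(recognize(None))  # ['Monster Reborn']
```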
Models can be downloaded in <a href="https://huggingface.co/HichTala/draw">Hugging Face</a>.
Models starting with `beit` stands for classification and the one starting with `yolo` for localization.
[](https://huggingface.co/HichTala/draw)
For now only models for "retro" gameplay are available, but the ones for classic format play will be added soon.
I considered the "retro" format to include all cards before the first _Synchro_ set, i.e. all cards released up to the Light of Destruction set (LODT, 05/13/2008) and all Speed Duel cards.
</details>
<details open>
<summary>Data</summary>
To create a dataset, the <a href="https://ygoprodeck.com/api-guide/">YGOPRODeck</a> API was used. Two datasets were thus created,
one for "retro" play and the other for classic format play. Just as there is a model for each type of card,
there is a dataset for each type of card.
The dataset can be downloaded from <a href="https://huggingface.co/datasets/HichTala/yugioh_dataset">Hugging Face</a>.
[](https://huggingface.co/datasets/HichTala/yugioh_dataset)
For now only "retro" dataset is available, but the one for classic format play will be added soon.
</details>
---
## <div align="center">💡Inspiration</div>
This project is inspired by content creator [SuperZouloux](https://www.youtube.com/watch?v=64-LfbggqKI)'s idea of a hologram bringing _Yu-Gi-Oh!_ cards to life.
His project uses chips inserted under the sleeves of each card,
which are read by the play mat, enabling the cards to be recognized.
Inserting the chips into the sleeves is not only laborious, but also poses another problem:
face-down cards are read in the same way as face-up ones.
So an automatic detector is a really suitable solution.
Although this project was discouraged by _KONAMI_ <sup>®</sup>, the game's publisher (which is quite understandable),
we can nevertheless imagine such a system being used to display the cards played during a live duel,
to allow spectators to read the cards.
---
## <div align="center">🔗Related Works</div>
Although to my knowledge `draw` is the first detector capable of locating and detecting _Yu-Gi-Oh!_ cards in a dueling environment,
other works exist and were a source of inspiration for this project. It's worth mentioning them here.
[Yu-Gi-Oh! NEURON](https://www.konami.com/games/eu/fr/products/yugioh_neuron/) is an official application developed by _KONAMI_ <sup>®</sup>.
It's packed with features, including card recognition. The application is capable of recognizing a total of 20 cards at a time, which is very decent.
The drawback is that the cards must be of good quality to be recognized, which is not necessarily the case in a duel context.
What's more, it can't be integrated, so the only way to use it is to use the application.
[yugioh one shot learning](https://github.com/vanstorm9/yugioh-one-shot-learning) by `vanstorm9` is a
Yu-Gi-Oh! card classification program that allows you to recognize cards. It uses a siamese network to train its classification
model. It gives very impressive results on good-quality images, but much weaker ones on low-quality images, and it
can't localize cards.
[Yolov8](https://github.com/ultralytics/ultralytics) is the latest version of the very famous `yolo` family of object detection models.
It hardly needs an introduction today: it represents the state of the art in real-time object detection.
[BEiT](https://arxiv.org/pdf/2106.08254.pdf) is a pre-trained model for image classification. It uses image transformers,
which are based on the attention mechanism. It suits our problem because the authors also provide a model pre-trained on `ImageNet-22K`,
a dataset with 22k classes (more than most classifiers), which is interesting for our case since there are more than 11k cards in _Yu-Gi-Oh!_.
---
## <div align="center">🔍Method Overview</div>
A Medium blog post will soon be written and published, explaining the main process from data collection to final prediction.
If you have any questions, don't hesitate to open an issue.
---
## <div align="center">💬Contact</div>
You can reach me on Twitter [@tiazden](https://twitter.com/tiazden) or by email at [hich.tala.phd@gmail.com](mailto:hich.tala.phd@gmail.com). |
qwer0213/CWC_dataset | ---
license: cc-by-4.0
---
|
cschell/xr-motion-dataset-catalogue | ---
language:
- en
tags:
- kinematic research
- XR user motions
- VR user motions
- AR user motions
- motions
pretty_name: XR Motion Dataset Catalogue
---
# XR Motion Dataset Catalogue
## Overview
The XR Motion Dataset Catalogue, accompanying our paper "Navigating the Kinematic Maze: A Comprehensive Guide to XR Motion Dataset Standards," standardizes and simplifies access to Extended Reality (XR) motion datasets. The catalogue represents our initiative to streamline the usage of kinematic data in XR research by aligning various datasets to a consistent format and structure.
### Dataset Specifications
All datasets in this catalogue have been standardized with the following specifications:
- **Coordinate System:** X (Right), Y (Up), Z (Forward)
- **Rotation Representation:** Quaternions
- **Units of Measurement:** Centimeters for spatial data
- **Time Encoding:** Milliseconds for time-related data
These specifications ensure uniformity and comparability across all datasets in the catalogue.
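As an illustration of these conventions, converting a recording stored in meters and seconds might look like this. This is a minimal sketch with illustrative field names; the actual conversion scripts in the repository below also handle attribute names and rotation formats:

```python
# Minimal sketch: convert positions from meters to centimeters and
# timestamps from seconds to milliseconds, matching the catalogue's
# conventions. Field names are illustrative, not taken from any dataset.
def to_catalogue_units(sample):
    return {
        "timestamp_ms": sample["timestamp_s"] * 1000.0,
        "head_pos_x_cm": sample["head_pos_x_m"] * 100.0,
        "head_pos_y_cm": sample["head_pos_y_m"] * 100.0,
        "head_pos_z_cm": sample["head_pos_z_m"] * 100.0,
    }

converted = to_catalogue_units(
    {"timestamp_s": 1.5, "head_pos_x_m": 0.25, "head_pos_y_m": 1.75, "head_pos_z_m": -0.5}
)
print(converted["timestamp_ms"], converted["head_pos_x_cm"])  # 1500.0 25.0
```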
### Conversion Scripts Repository
The alignment of datasets was facilitated by a series of conversion scripts, which are available in our GitHub repository: [XR Motion Dataset Conversion Scripts](https://github.com/cschell/xr-motion-dataset-conversion-scripts). These scripts detail the process of aligning attribute names, coordinate systems, rotation representations, units of measurement, and time encoding.
### Included Datasets
The catalogue includes the following datasets:
1. [LiebersBeatSaber23](https://doi.org/10.1145/3611659.3615696)
2. [Boxrr23](https://doi.org/10.25350/B5NP4V) – *edit 2024-03-25: upload in progress*
3. BOXRR24 – *WIP: we are currently working on the next version of the BOXRR-23 dataset, which will include significantly more user – we do our best to make it available later this year*
4. [LiebersHand22](https://doi.org/10.1080/10447318.2022.2120845)
5. [LiebersLabStudy21](https://doi.org/10.1145/3411764.3445528)
6. [MooreCrossDomain23](https://doi.org/10.1109/ISMAR59233.2023.00054)
7. <del>[RMillerBall22](https://github.com/Terascale-All-sensing-Research-Studio/VR-Biometric-Authentication)</del> *request for permissions pending*
8. [VrNet](http://arxiv.org/abs/2306.03381)
9. [WhoIsAlyx](https://doi.org/10.3389/frvir.2023.1272234)
## Installation and Usage
### Loading the Dataset with Hugging Face `datasets` Library
To load a dataset from the catalogue, use the `datasets` library in Python. For example, to load the `WhoIsAlyx` dataset:
```python
from datasets import load_dataset
dataset = load_dataset("cschell/xr-motion-dataset-catalogue", "who_is_alyx", trust_remote_code=True)
```
### Loading Individual Recordings with Pandas
To load individual recordings, you can use `pandas`. Here's an example:
```python
import pandas as pd
file_url_path = "hf://datasets/cschell/xr-motion-dataset-catalogue/who_is_alyx/player_02/2022-01-07.parquet"
recording = pd.read_parquet(file_url_path)
```
## Contributing and Feedback
Contributions and feedback are welcome to enhance the XR Motion Dataset Catalogue. Feel free to open a pull request or contact us directly.
<!--
## Citation
If you use the XR Motion Dataset Catalogue in your research, please cite our paper:
```
@article{your_paper_identifier,
title={Navigating the Kinematic Maze: A Comprehensive Guide to XR Motion Dataset Standards},
author={Your Name and Other Authors},
journal={Journal Name},
year={Year}
}
``` -->
|
sgoedecke/powerful_owl_5s_16k | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: label
dtype:
class_label:
names:
'0': owl
'1': not_owl
- name: filename
dtype: string
splits:
- name: train
num_bytes: 549398261.936
num_examples: 1808
download_size: 500805913
dataset_size: 549398261.936
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
JusperLee/VoxCeleb2-2Mix | ---
license: apache-2.0
language:
- en
tags:
- speech separation
size_categories:
- 100M<n<1B
---
The VoxCeleb2 dataset contains over one million utterances from 6,112 speakers,
extracted from YouTube videos and divided into Dev and Test folders. We used the same setup as previous works
(Li et al., 2022; Gao & Grauman, 2021; Lee et al., 2021), constructing the training and validation sets by selecting 5% of the data from the Dev folder of
VoxCeleb2. Like LRS2, VoxCeleb2 also contains a significant amount of noise
and reverberation, making it closer to real-world scenarios, but the acoustic environment of VoxCeleb2 is more complex and
challenging. It comprises a 56-hour training set, a 3-hour validation set, and a 1.5-hour test set.
yuchenlin/G-PlanET | ---
task_categories:
- text-generation
- table-to-text
- table-question-answering
language:
- en
license: apache-2.0
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:** https://arxiv.org/abs/2209.00465
- **Leaderboard:**
- **Point of Contact:** yuchenlin1995@gmail.com
### Dataset Summary
This **G-PlanET** dataset is built on AI2 [ALFRED](https://leaderboard.allenai.org/alfred/submissions/get-started).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
HoangHa/CleanedChemBL | ---
license: mit
---
|
CyberHarem/sakura_nene_newgame | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Sakura Nene
This is the dataset of Sakura Nene, containing 278 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 278 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 633 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 698 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 278 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 278 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 278 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 633 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 633 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 568 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 698 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 698 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
Sourabh2/Hindi_Poems | ---
dataset_info:
features:
- name: Poet's Name
dtype: string
- name: Period
dtype: string
- name: Language
dtype: string
- name: Additional Info
dtype: string
- name: Poem Text
dtype: string
splits:
- name: train
num_bytes: 791949
num_examples: 1151
download_size: 322710
dataset_size: 791949
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
anan-2024/twitter_dataset_1713106333 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 126410
num_examples: 336
download_size: 69575
dataset_size: 126410
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
group2test/tutorial-images | ---
license: apache-2.0
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/0074e854 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 181
num_examples: 10
download_size: 1322
dataset_size: 181
---
# Dataset Card for "0074e854"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Eitanli/recipe_qa_dataset_v3 | ---
dataset_info:
features:
- name: question
dtype: string
- name: context
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: train
num_bytes: 4158396.238977822
num_examples: 32933
- name: test
num_bytes: 519846.880511089
num_examples: 4117
- name: valid
num_bytes: 519846.880511089
num_examples: 4117
download_size: 1705587
dataset_size: 5198090.0
---
# Dataset Card for "recipe_qa_dataset_v3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ScarletDelphox/Eevee | ---
license: unknown
---
|
liuyanchen1015/MULTI_VALUE_rte_relativizer_where | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 225959
num_examples: 491
- name: train
num_bytes: 212749
num_examples: 457
download_size: 289780
dataset_size: 438708
---
# Dataset Card for "MULTI_VALUE_rte_relativizer_where"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bala1524/Drug_Combination_Extraction | ---
license: apache-2.0
task_categories:
- question-answering
language:
- en
tags:
- medical
- biology
size_categories:
- 1K<n<10K
--- |
multi-train/SimpleWiki_1107 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: query
dtype: string
- name: pos
sequence: string
- name: neg
sequence: string
- name: task
dtype: string
- name: instruction
struct:
- name: query
dtype: string
- name: pos
dtype: string
- name: neg
dtype: string
splits:
- name: train
num_bytes: 57699115
num_examples: 102225
download_size: 29311247
dataset_size: 57699115
---
# Dataset Card for "SimpleWiki_1107"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
health360/Ultrachat-Filtered-Multiple-Conversations-Alpaca-Tinyllama-Tokenized | ---
dataset_info:
features:
- name: text
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 3332012908
num_examples: 207865
download_size: 1088335043
dataset_size: 3332012908
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Ultrachat-Filtered-Multiple-Conversations-Alpaca-Tinyllama-Tokenized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cetacean/tt | ---
license: openrail
---
|
hgbt/test | ---
license: unknown
---
|
AustinMcMike/Steve_Jobs_ChatML | ---
license: apache-2.0
---
|
NeelNanda/pile-tokenized-10b | ---
dataset_info:
features:
- name: tokens
sequence: uint16
splits:
- name: train
num_bytes: 22153340700
num_examples: 10795975
download_size: 19746448291
dataset_size: 22153340700
---
# Dataset Card for "pile-tokenized-10b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
leo4life/algoml_bookcorpus_bottom_10p | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 476567316
num_examples: 7400423
download_size: 297714041
dataset_size: 476567316
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_rte_not_preverbal_negator | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 47974
num_examples: 107
- name: train
num_bytes: 41428
num_examples: 83
download_size: 68985
dataset_size: 89402
---
# Dataset Card for "MULTI_VALUE_rte_not_preverbal_negator"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nielsr/ship-detection-sliced-bis | ---
dataset_info:
features:
- name: image
dtype: image
- name: annotation
struct:
- name: bbox
sequence:
sequence: float64
splits:
- name: train
num_bytes: 6424334571.725
num_examples: 6765
- name: test
num_bytes: 7016088682.008
num_examples: 7782
download_size: 12048663562
dataset_size: 13440423253.733002
---
# Dataset Card for "ship-detection-sliced-bis"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adamo1139/AEZAKMI_v3 | ---
license: other
license_name: other
license_link: LICENSE
---
## Base information
AEZAKMI V3 is built on top of AEZAKMI V2, with many new samples.
I removed all coding samples, plus those containing "BEGINCONTEXT ENDCONTEXT References:", as they were destabilizing training at longer sequence lengths. \
I included a filtered no_robots_sharegpt dataset, which makes this dataset non-commercial only! From no_robots, I removed stories, mentions of AI, coding, etc. \
I added a wsb dataset, based on Sentdex/wsb_reddit_v001, but removed all samples shorter than 300 or 500 characters (I no longer remember which).
Finally, I removed all samples longer than 10000 characters from the dataset; my thinking is that those 13k-15k character samples would have been given greater weight during training, assuming sample packing is enabled.
They would have taken up more space within a packed sample and could introduce errors similar to what I noticed with the coding and BEGINCONTEXT "contextual" parts of airoboros. I don't want my model writing code or starting some odd context out of the blue, and those longer samples did that. |
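The length filtering described above can be sketched as follows. This is an illustrative sketch only; the actual cleaning script is not published, so the `samples` structure, the function name, and the thresholds are assumptions.

```python
# Illustrative sketch of the length-based filtering described above;
# the field name "text" and the thresholds are assumptions.

def filter_by_length(samples, min_chars=300, max_chars=10000):
    """Keep samples whose text length falls within [min_chars, max_chars]."""
    kept = []
    for sample in samples:
        n = len(sample["text"])
        if min_chars <= n <= max_chars:
            kept.append(sample)
    return kept

samples = [
    {"text": "x" * 100},    # too short: dropped
    {"text": "x" * 5000},   # kept
    {"text": "x" * 15000},  # too long: dropped
]
print(len(filter_by_length(samples)))  # 1
```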
liuyanchen1015/MULTI_VALUE_sst2_plural_postposed | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 73327
num_examples: 472
- name: test
num_bytes: 147654
num_examples: 945
- name: train
num_bytes: 2358428
num_examples: 20763
download_size: 1495737
dataset_size: 2579409
---
# Dataset Card for "MULTI_VALUE_sst2_plural_postposed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/kousaka_umi_theidolmstermillionlive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kousaka_umi/高坂海美 (THE iDOLM@STER: Million Live!)
This is the dataset of kousaka_umi/高坂海美 (THE iDOLM@STER: Million Live!), containing 268 images and their tags.
The core tags of this character are `long_hair, brown_hair, braid, blue_eyes, twin_braids, bangs, breasts, medium_breasts`, which are pruned in this dataset.
Images were crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 268 | 298.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kousaka_umi_theidolmstermillionlive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 268 | 195.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kousaka_umi_theidolmstermillionlive/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 620 | 398.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kousaka_umi_theidolmstermillionlive/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 268 | 275.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kousaka_umi_theidolmstermillionlive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 620 | 520.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kousaka_umi_theidolmstermillionlive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kousaka_umi_theidolmstermillionlive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, looking_at_viewer, solo, aqua_eyes, open_mouth, :d, bracelet, necklace, navel |
| 1 | 9 |  |  |  |  |  | 1girl, smile, striped_bikini, blush, cleavage, looking_at_viewer, navel, open_mouth, solo, collarbone, white_background, front-tie_top, simple_background |
| 2 | 10 |  |  |  |  |  | day, smile, 1girl, navel, solo, cleavage, outdoors, looking_at_viewer, open_mouth, beach, blue_sky, bracelet, cloud, cowboy_shot, palm_tree, side-tie_bikini_bottom, standing |
| 3 | 14 |  |  |  |  |  | 1girl, solo, open_mouth, white_shirt, looking_at_viewer, short_sleeves, blush, short_shorts, midriff, navel, tied_shirt, :d, belt, blue_shorts, collarbone, hair_between_eyes, side_braid, denim_shorts, standing, white_background, simple_background, striped_shirt, crop_top, holding, very_long_hair |
| 4 | 7 |  |  |  |  |  | looking_at_viewer, open_mouth, 1girl, beret, red_neckerchief, solo, white_headwear, white_shirt, blue_skirt, red_bow, sleeveless_shirt, standing, white_sailor_collar, cowboy_shot, hair_bow, holding, miniskirt, pleated_skirt, short_shorts, shorts_under_skirt, white_gloves, :d, ;d, blue_sky, cloud, day, hat_bow, one_eye_closed, white_shorts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | aqua_eyes | open_mouth | :d | bracelet | necklace | navel | smile | striped_bikini | blush | cleavage | collarbone | white_background | front-tie_top | simple_background | day | outdoors | beach | blue_sky | cloud | cowboy_shot | palm_tree | side-tie_bikini_bottom | standing | white_shirt | short_sleeves | short_shorts | midriff | tied_shirt | belt | blue_shorts | hair_between_eyes | side_braid | denim_shorts | striped_shirt | crop_top | holding | very_long_hair | beret | red_neckerchief | white_headwear | blue_skirt | red_bow | sleeveless_shirt | white_sailor_collar | hair_bow | miniskirt | pleated_skirt | shorts_under_skirt | white_gloves | ;d | hat_bow | one_eye_closed | white_shorts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:------------|:-------------|:-----|:-----------|:-----------|:--------|:--------|:-----------------|:--------|:-----------|:-------------|:-------------------|:----------------|:--------------------|:------|:-----------|:--------|:-----------|:--------|:--------------|:------------|:-------------------------|:-----------|:--------------|:----------------|:---------------|:----------|:-------------|:-------|:--------------|:--------------------|:-------------|:---------------|:----------------|:-----------|:----------|:-----------------|:--------|:------------------|:-----------------|:-------------|:----------|:-------------------|:----------------------|:-----------|:------------|:----------------|:---------------------|:---------------|:-----|:----------|:-----------------|:---------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | X | | X | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 10 |  |  |  |  |  | X | X | X | | X | | X | | X | X | | | X | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 14 |  |  |  |  |  | X | X | X | | X | X | | | X | | | X | | X | X | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | X | X | | X | X | | | | | | | | | | | | X | | | X | X | X | | | X | X | | X | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Mike36Theone/Fernando | ---
license: cdla-sharing-1.0
---
|
Nexdata/Russian_Conversational_Speech_Data_by_Mobile_Phone | ---
language:
- ru
task_categories:
- conversational
- automatic-speech-recognition
---
# Dataset Card for Nexdata/Russian_Conversational_Speech_Data_by_Mobile_Phone
## Description
The 107-hour Russian Conversational Speech dataset involves 134 native speakers with a balanced gender ratio. Speakers chose a few familiar topics from a given list and held conversations, ensuring the dialogues were fluent and natural. The recording devices were various mobile phones. The audio format is 16 kHz, 16-bit, uncompressed WAV, and all speech was recorded in quiet indoor environments. All audio was manually transcribed with the text content, the start and end time of each effective sentence, and speaker identification.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1208?source=Huggingface
# Specifications
## Format
16kHz, 16bit, uncompressed wav, mono channel;
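A quick way to verify that a downloaded recording matches this spec is Python's standard `wave` module. This is a hedged sketch, not part of the dataset; the function name is our own.

```python
import wave

def matches_spec(path):
    """True if the file is 16 kHz, 16-bit, mono, uncompressed PCM WAV."""
    with wave.open(path, "rb") as wf:
        return (
            wf.getframerate() == 16000
            and wf.getsampwidth() == 2      # 16-bit samples are 2 bytes wide
            and wf.getnchannels() == 1      # mono channel
            and wf.getcomptype() == "NONE"  # uncompressed PCM
        )
```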
## Recording Environment
quiet indoor environment, without echo;
## Recording content
dozens of topics are specified, and speakers converse on those topics while being recorded;
## Demographics
134 speakers totally, with 47% males and 53% females;
## Annotation
annotation of transcription text, speaker identification, gender, and noise symbols;
## Device
Android mobile phone, iPhone;
## Language
Russian
## Application scenarios
speech recognition; voiceprint recognition;
## Accuracy rate
the word accuracy rate is not less than 98%
# Licensing Information
Commercial License |
xwjzds/ag_news | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': World
'1': Sports
'2': Business
'3': Sci/Tech
splits:
- name: train
num_bytes: 29817303
num_examples: 120000
- name: test
num_bytes: 1879474
num_examples: 7600
download_size: 19820267
dataset_size: 31696777
---
# Dataset Card for "ag_news"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
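The `class_label` mapping in the YAML above corresponds to the following integer-to-name lookup; this small helper is illustrative and not part of the dataset itself.

```python
# Label mapping taken from the class_label names in the card's YAML.
AG_NEWS_LABELS = {0: "World", 1: "Sports", 2: "Business", 3: "Sci/Tech"}

def label_name(label_id):
    """Map an ag_news integer label to its class name."""
    return AG_NEWS_LABELS[label_id]

print(label_name(2))  # Business
```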
NomeIncrivel/Cell | ---
license: openrail
---
|
longevity-genie/longevity-map-db-llm | ---
license: agpl-3.0
---
|
Dabou/Test | ---
task_categories:
- text-generation
language:
- fr
--- |
FVilmar/conectado_dias | ---
license: openrail
---
|
qgallouedec/prj_gia_dataset_metaworld_plate_slide_v2_1111 | ---
library_name: gia
tags:
- deep-reinforcement-learning
- reinforcement-learning
- gia
- multi-task
- multi-modal
- imitation-learning
- offline-reinforcement-learning
---
An imitation learning dataset for the plate-slide-v2 environment, sampled from the plate-slide-v2 policy.
This dataset was created as part of the Generally Intelligent Agents (GIA) project: https://github.com/huggingface/gia
## Load dataset
First, clone it with
```sh
git clone https://huggingface.co/datasets/qgallouedec/prj_gia_dataset_metaworld_plate_slide_v2_1111
```
Then, load it with
```python
import numpy as np
dataset = np.load("prj_gia_dataset_metaworld_plate_slide_v2_1111/dataset.npy", allow_pickle=True).item()
print(dataset.keys()) # dict_keys(['observations', 'actions', 'dones', 'rewards'])
```
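Once loaded, the flat arrays can be split into per-episode chunks using the `dones` flags. A minimal sketch with synthetic stand-in data (the real arrays come from `dataset.npy`, and their exact shapes may differ):

```python
import numpy as np

def split_episodes(dataset):
    """Split flat trajectories into per-episode chunks using the dones flags."""
    ends = np.flatnonzero(dataset["dones"]) + 1  # index one past each episode end
    starts = np.concatenate(([0], ends[:-1]))
    return [
        {k: dataset[k][s:e] for k in ("observations", "actions", "rewards")}
        for s, e in zip(starts, ends)
    ]

# Synthetic stand-in for the real dataset.npy contents
demo = {
    "observations": np.arange(5),
    "actions": np.arange(5),
    "rewards": np.ones(5),
    "dones": np.array([0, 0, 1, 0, 1], dtype=bool),
}
episodes = split_episodes(demo)
print(len(episodes))  # 2 episodes, of lengths 3 and 2
```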
|
bob80333/animefacesv2 | ---
license: unknown
---
|
clarin-knext/scidocs-pl-qrels | ---
language:
- pl
---
Part of **BEIR-PL: Zero Shot Information Retrieval Benchmark for the Polish Language**.
Link to arxiv: https://arxiv.org/pdf/2305.19840.pdf
Contact: konrad.wojtasik@pwr.edu.pl |
daat/DATA | ---
language:
- zh
tags:
- not-for-all-audiences
size_categories:
- 100K<n<1M
---
password: `"KVmQt9UGNoHRLcNAMCqLtRj8kNNDnGNN"` (the quotation marks are part of the password) |
autoevaluate/autoeval-staging-eval-autoevaluate__squad-sample-autoevaluate__squad-sample-778ba0-17436360 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- autoevaluate/squad-sample
eval_info:
task: extractive_question_answering
model: autoevaluate/extractive-question-answering-not-evaluated
metrics: []
dataset_name: autoevaluate/squad-sample
dataset_config: autoevaluate--squad-sample
dataset_split: test
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: autoevaluate/extractive-question-answering-not-evaluated
* Dataset: autoevaluate/squad-sample
* Config: autoevaluate--squad-sample
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
abacusai/ARC_DPO_FewShot | ---
license: apache-2.0
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 5112954
num_examples: 3357
- name: eval
num_bytes: 1446464
num_examples: 895
download_size: 1139410
dataset_size: 6559418
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval
path: data/eval-*
---

# Dataset Card for "ARC_DPO_FewShot"
[ARC](https://arxiv.org/abs/1803.05457) is a dataset that tests the level of understanding of science at approximately grade-school level.
We focus specifically on the 'Challenge' subsection of ARC, the more difficult of the two subsections, which has been widely adopted as a measure of LLM reasoning and world understanding.
We create a paired preference-ranked dataset from the train split of ARC-Challenge.
The dataset is partitioned into questions which we take as our prompts x, and four choices of responses to each question of which only one is the correct answer.
The correct response is taken as y_w and the incorrect responses are taken to be y_l; as there are three incorrect responses for every prompt, we repeat y_w multiple times for each prompt.
The dataset is meant to be used to fine-tune LLMs (which have already undergone SFT) using the DPOP loss function. We used this dataset to create the [Smaug series of models](https://github.com/abacusai/smaug). See our paper for more details.
The dataset contains 3357 training examples and 895 evaluation examples.
See more details in the [datasheet](https://github.com/abacusai/smaug/blob/main/datasheet.md).
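The pairing scheme described above can be sketched as follows. The output field names mirror this dataset's schema; the ARC-style input record and the function name are illustrative assumptions.

```python
def make_preference_pairs(question, choices, answer_key):
    """Build (prompt, chosen, rejected) rows from one ARC-style item.

    The correct choice (y_w) is repeated once per incorrect choice (y_l),
    as described above.
    """
    chosen = choices[answer_key]
    return [
        {"prompt": question, "chosen": chosen, "rejected": text}
        for key, text in choices.items()
        if key != answer_key
    ]

item = {
    "question": "Which gas do plants absorb for photosynthesis?",
    "choices": {"A": "Oxygen", "B": "Carbon dioxide", "C": "Nitrogen", "D": "Helium"},
    "answerKey": "B",
}
pairs = make_preference_pairs(item["question"], item["choices"], item["answerKey"])
print(len(pairs))  # 3 pairs, one per incorrect choice
```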
|
communityai/Open-Orca___1million-gpt-4-300k | ---
dataset_info:
features:
- name: source
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 556699683.6855309
num_examples: 300000
download_size: 294686219
dataset_size: 556699683.6855309
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
frisken/test | ---
license: mit
---
|
kbatyshchev/race_formatted | ---
license: mit
---
|
vwxyzjn/cai-conversation-dev1704836562 | ---
dataset_info:
features:
- name: index
dtype: int64
- name: prompt
dtype: string
- name: init_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: init_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: critic_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: critic_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: revision_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: revision_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_sft
num_bytes: 265134
num_examples: 64
- name: train_prefs
num_bytes: 247352
num_examples: 64
download_size: 263052
dataset_size: 512486
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: train_prefs
path: data/train_prefs-*
---
# Dataset Card for "cai-conversation-dev1704836562"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_sst2_existential_there | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 5189
num_examples: 34
- name: test
num_bytes: 7839
num_examples: 53
- name: train
num_bytes: 77866
num_examples: 609
download_size: 43667
dataset_size: 90894
---
# Dataset Card for "MULTI_VALUE_sst2_existential_there"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
datahrvoje/twitter_dataset_1713057539 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 23751
num_examples: 54
download_size: 11438
dataset_size: 23751
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bigscience-data/roots_indic-bn_wikivoyage | ---
language: bn
license: cc-by-sa-3.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_indic-bn_wikivoyage
# wikivoyage_filtered
- Dataset uid: `wikivoyage_filtered`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 0.0334 % of total
- 0.1097 % of en
- 0.0432 % of fr
- 0.0863 % of es
- 0.0084 % of zh
- 0.0892 % of vi
- 0.0464 % of indic-bn
- 0.0443 % of pt
- 0.0130 % of indic-hi
### BigScience processing steps
#### Filters applied to: en
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_en
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: fr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_fr
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: es
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_es
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: zh
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_zhs
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: vi
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_vi
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-bn
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-bn
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: pt
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_pt
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-hi
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
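The per-language steps above form a sequential pipeline: each named filter transforms the document list and passes it to the next. A minimal sketch of how such filters compose; the filter bodies here are simplified stand-ins, not the actual BigScience implementations.

```python
# Simplified stand-ins for a few of the named filters above.
def dedup_document(docs):
    """Drop exact duplicate documents, keeping first occurrences."""
    seen, out = set(), []
    for d in docs:
        if d not in seen:
            seen.add(d)
            out.append(d)
    return out

def filter_remove_empty_docs(docs):
    """Drop documents that are empty or whitespace-only."""
    return [d for d in docs if d.strip()]

def replace_newline_with_space(docs):
    return [d.replace("\n", " ") for d in docs]

def run_pipeline(docs, steps):
    """Apply each filter step in order."""
    for step in steps:
        docs = step(docs)
    return docs

docs = ["a\ndoc", "a\ndoc", "", "another doc"]
cleaned = run_pipeline(
    docs, [dedup_document, filter_remove_empty_docs, replace_newline_with_space]
)
print(cleaned)  # ['a doc', 'another doc']
```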
|
open-llm-leaderboard/details_sethuiyer__Dr_Samantha-7b | ---
pretty_name: Evaluation run of sethuiyer/Dr_Samantha-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [sethuiyer/Dr_Samantha-7b](https://huggingface.co/sethuiyer/Dr_Samantha-7b) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sethuiyer__Dr_Samantha-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-05T00:01:41.820538](https://huggingface.co/datasets/open-llm-leaderboard/details_sethuiyer__Dr_Samantha-7b/blob/main/results_2024-01-05T00-01-41.820538.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4806802958525362,\n\
\ \"acc_stderr\": 0.03430510555450672,\n \"acc_norm\": 0.4854584968099748,\n\
\ \"acc_norm_stderr\": 0.03506159854197481,\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.01600265148736101,\n \"mc2\": 0.45584336369461415,\n\
\ \"mc2_stderr\": 0.015035191366607928\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.48890784982935154,\n \"acc_stderr\": 0.014607794914013053,\n\
\ \"acc_norm\": 0.53839590443686,\n \"acc_norm_stderr\": 0.014568245550296358\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5848436566421031,\n\
\ \"acc_stderr\": 0.0049174193677660296,\n \"acc_norm\": 0.7795259908384784,\n\
\ \"acc_norm_stderr\": 0.004137190475425532\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n\
\ \"acc_stderr\": 0.042561937679014075,\n \"acc_norm\": 0.4148148148148148,\n\
\ \"acc_norm_stderr\": 0.042561937679014075\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.040657710025626036,\n\
\ \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.040657710025626036\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5283018867924528,\n \"acc_stderr\": 0.030723535249006107,\n\
\ \"acc_norm\": 0.5283018867924528,\n \"acc_norm_stderr\": 0.030723535249006107\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5208333333333334,\n\
\ \"acc_stderr\": 0.04177578950739993,\n \"acc_norm\": 0.5208333333333334,\n\
\ \"acc_norm_stderr\": 0.04177578950739993\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3872832369942196,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.3872832369942196,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n\
\ \"acc_stderr\": 0.044895393502707,\n \"acc_norm\": 0.3508771929824561,\n\
\ \"acc_norm_stderr\": 0.044895393502707\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.28835978835978837,\n \"acc_stderr\": 0.02333065405453589,\n \"\
acc_norm\": 0.28835978835978837,\n \"acc_norm_stderr\": 0.02333065405453589\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523811,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523811\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.532258064516129,\n\
\ \"acc_stderr\": 0.028384747788813332,\n \"acc_norm\": 0.532258064516129,\n\
\ \"acc_norm_stderr\": 0.028384747788813332\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.033864057460620905,\n\
\ \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.033864057460620905\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\"\
: 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.0381549430868893,\n\
\ \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.0381549430868893\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6262626262626263,\n \"acc_stderr\": 0.03446897738659333,\n \"\
acc_norm\": 0.6262626262626263,\n \"acc_norm_stderr\": 0.03446897738659333\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7098445595854922,\n \"acc_stderr\": 0.032752644677915166,\n\
\ \"acc_norm\": 0.7098445595854922,\n \"acc_norm_stderr\": 0.032752644677915166\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4307692307692308,\n \"acc_stderr\": 0.02510682066053975,\n \
\ \"acc_norm\": 0.4307692307692308,\n \"acc_norm_stderr\": 0.02510682066053975\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.03196876989195778,\n \
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.03196876989195778\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6660550458715596,\n \"acc_stderr\": 0.020220554196736407,\n \"\
acc_norm\": 0.6660550458715596,\n \"acc_norm_stderr\": 0.020220554196736407\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.03114144782353602,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.03114144782353602\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6323529411764706,\n \"acc_stderr\": 0.03384132045674119,\n \"\
acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.03384132045674119\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6413502109704642,\n \"acc_stderr\": 0.03121956944530183,\n \
\ \"acc_norm\": 0.6413502109704642,\n \"acc_norm_stderr\": 0.03121956944530183\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5829596412556054,\n\
\ \"acc_stderr\": 0.03309266936071721,\n \"acc_norm\": 0.5829596412556054,\n\
\ \"acc_norm_stderr\": 0.03309266936071721\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.04356447202665069,\n\
\ \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.04356447202665069\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.04345724570292534,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.04345724570292534\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04712821257426769,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04712821257426769\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5521472392638037,\n \"acc_stderr\": 0.03906947479456606,\n\
\ \"acc_norm\": 0.5521472392638037,\n \"acc_norm_stderr\": 0.03906947479456606\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.047504583990416946,\n\
\ \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.047504583990416946\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7136752136752137,\n\
\ \"acc_stderr\": 0.029614323690456648,\n \"acc_norm\": 0.7136752136752137,\n\
\ \"acc_norm_stderr\": 0.029614323690456648\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6679438058748404,\n\
\ \"acc_stderr\": 0.016841174655295724,\n \"acc_norm\": 0.6679438058748404,\n\
\ \"acc_norm_stderr\": 0.016841174655295724\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5173410404624278,\n \"acc_stderr\": 0.02690290045866664,\n\
\ \"acc_norm\": 0.5173410404624278,\n \"acc_norm_stderr\": 0.02690290045866664\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22905027932960895,\n\
\ \"acc_stderr\": 0.014054314935614569,\n \"acc_norm\": 0.22905027932960895,\n\
\ \"acc_norm_stderr\": 0.014054314935614569\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5032679738562091,\n \"acc_stderr\": 0.02862930519400354,\n\
\ \"acc_norm\": 0.5032679738562091,\n \"acc_norm_stderr\": 0.02862930519400354\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5562700964630225,\n\
\ \"acc_stderr\": 0.028217683556652315,\n \"acc_norm\": 0.5562700964630225,\n\
\ \"acc_norm_stderr\": 0.028217683556652315\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.558641975308642,\n \"acc_stderr\": 0.02762873715566877,\n\
\ \"acc_norm\": 0.558641975308642,\n \"acc_norm_stderr\": 0.02762873715566877\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.37943262411347517,\n \"acc_stderr\": 0.028947338851614105,\n \
\ \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.028947338851614105\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34615384615384615,\n\
\ \"acc_stderr\": 0.012150699768228565,\n \"acc_norm\": 0.34615384615384615,\n\
\ \"acc_norm_stderr\": 0.012150699768228565\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.41544117647058826,\n \"acc_stderr\": 0.029935342707877743,\n\
\ \"acc_norm\": 0.41544117647058826,\n \"acc_norm_stderr\": 0.029935342707877743\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.46568627450980393,\n \"acc_stderr\": 0.02018014484330729,\n \
\ \"acc_norm\": 0.46568627450980393,\n \"acc_norm_stderr\": 0.02018014484330729\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.047245774057315726,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.047245774057315726\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5102040816326531,\n \"acc_stderr\": 0.03200255347893783,\n\
\ \"acc_norm\": 0.5102040816326531,\n \"acc_norm_stderr\": 0.03200255347893783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6517412935323383,\n\
\ \"acc_stderr\": 0.033687874661154596,\n \"acc_norm\": 0.6517412935323383,\n\
\ \"acc_norm_stderr\": 0.033687874661154596\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.03424042924691583,\n\
\ \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.03424042924691583\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.01600265148736101,\n \"mc2\": 0.45584336369461415,\n\
\ \"mc2_stderr\": 0.015035191366607928\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7355958958168903,\n \"acc_stderr\": 0.012394724896983796\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.18802122820318423,\n \
\ \"acc_stderr\": 0.010762621695354893\n }\n}\n```"
repo_url: https://huggingface.co/sethuiyer/Dr_Samantha-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|arc:challenge|25_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|gsm8k|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hellaswag|10_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-01-41.820538.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T00-01-41.820538.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- '**/details_harness|winogrande|5_2024-01-05T00-01-41.820538.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-05T00-01-41.820538.parquet'
- config_name: results
data_files:
- split: 2024_01_05T00_01_41.820538
path:
- results_2024-01-05T00-01-41.820538.parquet
- split: latest
path:
- results_2024-01-05T00-01-41.820538.parquet
---
# Dataset Card for Evaluation run of sethuiyer/Dr_Samantha-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [sethuiyer/Dr_Samantha-7b](https://huggingface.co/sethuiyer/Dr_Samantha-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sethuiyer__Dr_Samantha-7b",
"harness_winogrande_5",
split="train")
```
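The per-task configurations listed in the YAML header follow a regular naming scheme. As a rough sketch (assuming the pattern visible in the header above holds for every task; this is an observation, not an official API), the parquet glob for a given task and shot count can be reconstructed like this:

```python
# Sketch: rebuild the parquet glob for a task, following the naming pattern
# seen in the YAML header above (an assumed convention, not a documented API).
def config_to_glob(task: str, n_shots: int, timestamp: str) -> str:
    return f"**/details_harness|{task}|{n_shots}_{timestamp}.parquet"

print(config_to_glob("winogrande", 5, "2024-01-05T00-01-41.820538"))
# -> **/details_harness|winogrande|5_2024-01-05T00-01-41.820538.parquet
```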
## Latest results
These are the [latest results from run 2024-01-05T00:01:41.820538](https://huggingface.co/datasets/open-llm-leaderboard/details_sethuiyer__Dr_Samantha-7b/blob/main/results_2024-01-05T00-01-41.820538.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4806802958525362,
"acc_stderr": 0.03430510555450672,
"acc_norm": 0.4854584968099748,
"acc_norm_stderr": 0.03506159854197481,
"mc1": 0.2974296205630355,
"mc1_stderr": 0.01600265148736101,
"mc2": 0.45584336369461415,
"mc2_stderr": 0.015035191366607928
},
"harness|arc:challenge|25": {
"acc": 0.48890784982935154,
"acc_stderr": 0.014607794914013053,
"acc_norm": 0.53839590443686,
"acc_norm_stderr": 0.014568245550296358
},
"harness|hellaswag|10": {
"acc": 0.5848436566421031,
"acc_stderr": 0.0049174193677660296,
"acc_norm": 0.7795259908384784,
"acc_norm_stderr": 0.004137190475425532
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.042561937679014075,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.042561937679014075
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5283018867924528,
"acc_stderr": 0.030723535249006107,
"acc_norm": 0.5283018867924528,
"acc_norm_stderr": 0.030723535249006107
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5208333333333334,
"acc_stderr": 0.04177578950739993,
"acc_norm": 0.5208333333333334,
"acc_norm_stderr": 0.04177578950739993
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3872832369942196,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.3872832369942196,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502707,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.28835978835978837,
"acc_stderr": 0.02333065405453589,
"acc_norm": 0.28835978835978837,
"acc_norm_stderr": 0.02333065405453589
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523811,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523811
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.532258064516129,
"acc_stderr": 0.028384747788813332,
"acc_norm": 0.532258064516129,
"acc_norm_stderr": 0.028384747788813332
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.033864057460620905,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.033864057460620905
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.0381549430868893,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.0381549430868893
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6262626262626263,
"acc_stderr": 0.03446897738659333,
"acc_norm": 0.6262626262626263,
"acc_norm_stderr": 0.03446897738659333
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7098445595854922,
"acc_stderr": 0.032752644677915166,
"acc_norm": 0.7098445595854922,
"acc_norm_stderr": 0.032752644677915166
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4307692307692308,
"acc_stderr": 0.02510682066053975,
"acc_norm": 0.4307692307692308,
"acc_norm_stderr": 0.02510682066053975
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.03196876989195778,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.03196876989195778
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6660550458715596,
"acc_stderr": 0.020220554196736407,
"acc_norm": 0.6660550458715596,
"acc_norm_stderr": 0.020220554196736407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.03114144782353602,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.03114144782353602
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.03384132045674119,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.03384132045674119
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6413502109704642,
"acc_stderr": 0.03121956944530183,
"acc_norm": 0.6413502109704642,
"acc_norm_stderr": 0.03121956944530183
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5829596412556054,
"acc_stderr": 0.03309266936071721,
"acc_norm": 0.5829596412556054,
"acc_norm_stderr": 0.03309266936071721
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5572519083969466,
"acc_stderr": 0.04356447202665069,
"acc_norm": 0.5572519083969466,
"acc_norm_stderr": 0.04356447202665069
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.04345724570292534,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.04345724570292534
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04712821257426769,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04712821257426769
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5521472392638037,
"acc_stderr": 0.03906947479456606,
"acc_norm": 0.5521472392638037,
"acc_norm_stderr": 0.03906947479456606
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.6407766990291263,
"acc_stderr": 0.047504583990416946,
"acc_norm": 0.6407766990291263,
"acc_norm_stderr": 0.047504583990416946
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7136752136752137,
"acc_stderr": 0.029614323690456648,
"acc_norm": 0.7136752136752137,
"acc_norm_stderr": 0.029614323690456648
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6679438058748404,
"acc_stderr": 0.016841174655295724,
"acc_norm": 0.6679438058748404,
"acc_norm_stderr": 0.016841174655295724
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5173410404624278,
"acc_stderr": 0.02690290045866664,
"acc_norm": 0.5173410404624278,
"acc_norm_stderr": 0.02690290045866664
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.22905027932960895,
"acc_stderr": 0.014054314935614569,
"acc_norm": 0.22905027932960895,
"acc_norm_stderr": 0.014054314935614569
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5032679738562091,
"acc_stderr": 0.02862930519400354,
"acc_norm": 0.5032679738562091,
"acc_norm_stderr": 0.02862930519400354
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5562700964630225,
"acc_stderr": 0.028217683556652315,
"acc_norm": 0.5562700964630225,
"acc_norm_stderr": 0.028217683556652315
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.558641975308642,
"acc_stderr": 0.02762873715566877,
"acc_norm": 0.558641975308642,
"acc_norm_stderr": 0.02762873715566877
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.37943262411347517,
"acc_stderr": 0.028947338851614105,
"acc_norm": 0.37943262411347517,
"acc_norm_stderr": 0.028947338851614105
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34615384615384615,
"acc_stderr": 0.012150699768228565,
"acc_norm": 0.34615384615384615,
"acc_norm_stderr": 0.012150699768228565
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41544117647058826,
"acc_stderr": 0.029935342707877743,
"acc_norm": 0.41544117647058826,
"acc_norm_stderr": 0.029935342707877743
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.46568627450980393,
"acc_stderr": 0.02018014484330729,
"acc_norm": 0.46568627450980393,
"acc_norm_stderr": 0.02018014484330729
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.047245774057315726,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.047245774057315726
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5102040816326531,
"acc_stderr": 0.03200255347893783,
"acc_norm": 0.5102040816326531,
"acc_norm_stderr": 0.03200255347893783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6517412935323383,
"acc_stderr": 0.033687874661154596,
"acc_norm": 0.6517412935323383,
"acc_norm_stderr": 0.033687874661154596
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7251461988304093,
"acc_stderr": 0.03424042924691583,
"acc_norm": 0.7251461988304093,
"acc_norm_stderr": 0.03424042924691583
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2974296205630355,
"mc1_stderr": 0.01600265148736101,
"mc2": 0.45584336369461415,
"mc2_stderr": 0.015035191366607928
},
"harness|winogrande|5": {
"acc": 0.7355958958168903,
"acc_stderr": 0.012394724896983796
},
"harness|gsm8k|5": {
"acc": 0.18802122820318423,
"acc_stderr": 0.010762621695354893
}
}
```
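To illustrate how per-task numbers like those above can be summarized, the "acc" values can be averaged directly from the results dict. This is a minimal sketch reproducing only three of the MMLU entries abbreviated from the JSON above (the full results file contains all tasks, and the leaderboard's own aggregation may differ):

```python
# Minimal sketch: average per-task "acc" values from the results dict above.
# Only three hendrycksTest entries are reproduced here for brevity.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.31},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4148148148148148},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.48026315789473684},
}
accs = [v["acc"] for v in results.values()]
mean_acc = sum(accs) / len(accs)
print(round(mean_acc, 4))
# -> 0.4017
```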
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
baaaaaaaam/t_llama2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7786
num_examples: 32
download_size: 4172
dataset_size: 7786
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Back-up/test-edu-crawl | ---
dataset_info:
features:
- name: id
dtype: string
- name: titles
dtype: string
- name: url
dtype: string
- name: content
dtype: string
- name: html
dtype: string
- name: domain
dtype: string
splits:
- name: train
num_bytes: 4402870373
num_examples: 59735
download_size: 2313400577
dataset_size: 4402870373
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DerErikP/translations | ---
dataset_info:
features:
- name: do_translation
dtype: string
splits:
- name: train
num_bytes: 56974
num_examples: 100
download_size: 0
dataset_size: 56974
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "translations"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/jervis_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of jervis/ジャーヴィス (Kantai Collection)
This is the dataset of jervis/ジャーヴィス (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `blonde_hair, long_hair, blue_eyes, hat, white_headwear, sailor_hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 519.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jervis_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 312.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jervis_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1177 | 673.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jervis_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 467.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jervis_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1177 | 928.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jervis_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/jervis_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
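For the IMG+TXT packages (e.g. `dataset-800.zip`), waifuc is not required. Assuming each image sits next to a same-named `.txt` file holding its comma-separated tags (a common convention for this package type, not documented here), a plain loader sketch could look like:

```python
import os

def load_tag_pairs(dataset_dir):
    """Collect (image_stem, tag_list) pairs from an extracted IMG+TXT package.

    Assumes each image has a sibling .txt file with comma-separated tags.
    """
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        if name.endswith(".txt"):
            stem = os.path.splitext(name)[0]
            with open(os.path.join(dataset_dir, name), encoding="utf-8") as f:
                tags = [t.strip() for t in f.read().split(",") if t.strip()]
            pairs.append((stem, tags))
    return pairs
```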
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 42 |  |  |  |  |  | 1girl, blue_sailor_collar, sailor_dress, short_sleeves, solo, white_dress, white_gloves, smile, looking_at_viewer, open_mouth, simple_background, white_background, cowboy_shot |
| 1 | 7 |  |  |  |  |  | 1girl, bangs, beret, black_bow, blue_sailor_collar, blush, cat_ears, fang, kemonomimi_mode, looking_at_viewer, open_mouth, solo, very_long_hair, white_dress, :d, heart, puffy_short_sleeves, black_shirt, hair_between_eyes, sailor_dress, white_background, cat_girl, cat_tail, white_gloves, animal_ear_fluff |
| 2 | 9 |  |  |  |  |  | 1girl, collarbone, looking_at_viewer, solo, covered_navel, open_mouth, small_breasts, blue_one-piece_swimsuit, cowboy_shot, blush, smile, old_school_swimsuit, twitter_username, alternate_costume, simple_background |
| 3 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, blush, navel, open_mouth, small_breasts, solo, white_panties, smile, underwear_only, collarbone, white_bra, sitting, white_gloves, bed, socks |
| 4 | 9 |  |  |  |  |  | 1girl, alternate_costume, blue_dress, smile, solo, beret, brown_coat, looking_at_viewer, black_headwear, cowboy_shot, long_sleeves, open_mouth, simple_background, bag, blush, heart |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_sailor_collar | sailor_dress | short_sleeves | solo | white_dress | white_gloves | smile | looking_at_viewer | open_mouth | simple_background | white_background | cowboy_shot | bangs | beret | black_bow | blush | cat_ears | fang | kemonomimi_mode | very_long_hair | :d | heart | puffy_short_sleeves | black_shirt | hair_between_eyes | cat_girl | cat_tail | animal_ear_fluff | collarbone | covered_navel | small_breasts | blue_one-piece_swimsuit | old_school_swimsuit | twitter_username | alternate_costume | navel | white_panties | underwear_only | white_bra | sitting | bed | socks | blue_dress | brown_coat | black_headwear | long_sleeves | bag |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------------|:---------------|:----------------|:-------|:--------------|:---------------|:--------|:--------------------|:-------------|:--------------------|:-------------------|:--------------|:--------|:--------|:------------|:--------|:-----------|:-------|:------------------|:-----------------|:-----|:--------|:----------------------|:--------------|:--------------------|:-----------|:-----------|:-------------------|:-------------|:----------------|:----------------|:--------------------------|:----------------------|:-------------------|:--------------------|:--------|:----------------|:-----------------|:------------|:----------|:------|:--------|:-------------|:-------------|:-----------------|:---------------|:------|
| 0 | 42 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | | X | X | X | | X | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | | | | X | | | X | X | X | X | | X | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | | | | X | | X | X | X | X | | | | | | | X | | | | | | | | | | | | | X | | X | | | | | X | X | X | X | X | X | X | | | | | |
| 4 | 9 |  |  |  |  |  | X | | | | X | | | X | X | X | X | | X | | X | | X | | | | | | X | | | | | | | | | | | | | X | | | | | | | | X | X | X | X | X |
|
adityarra07/train_ds_uwb | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 669064842.0330192
num_examples: 12613
- name: test
num_bytes: 26522827.322326932
num_examples: 500
download_size: 661104301
dataset_size: 695587669.3553461
---
# Dataset Card for "train_ds_uwb"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ardaorcun/instruct-data | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 3777246.430289243
num_examples: 2153
- name: test
num_bytes: 1621075.5697107571
num_examples: 924
download_size: 3040637
dataset_size: 5398322.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
abidlabs/cccccc | ---
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_NovoCode__Mistral-NeuralDPO | ---
pretty_name: Evaluation run of NovoCode/Mistral-NeuralDPO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NovoCode/Mistral-NeuralDPO](https://huggingface.co/NovoCode/Mistral-NeuralDPO)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NovoCode__Mistral-NeuralDPO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-19T05:08:02.139201](https://huggingface.co/datasets/open-llm-leaderboard/details_NovoCode__Mistral-NeuralDPO/blob/main/results_2024-02-19T05-08-02.139201.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6377097887516342,\n\
\ \"acc_stderr\": 0.03236225279625848,\n \"acc_norm\": 0.6430625035121067,\n\
\ \"acc_norm_stderr\": 0.033015613528751424,\n \"mc1\": 0.2864137086903305,\n\
\ \"mc1_stderr\": 0.01582614243950234,\n \"mc2\": 0.4326565666387199,\n\
\ \"mc2_stderr\": 0.01546107570311087\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6271331058020477,\n \"acc_stderr\": 0.014131176760131169,\n\
\ \"acc_norm\": 0.6604095563139932,\n \"acc_norm_stderr\": 0.013839039762820169\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6560446126269668,\n\
\ \"acc_stderr\": 0.004740555782142176,\n \"acc_norm\": 0.8469428400716988,\n\
\ \"acc_norm_stderr\": 0.003593067633474304\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901409,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901409\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438662,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438662\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396264,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396264\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3915343915343915,\n \"acc_stderr\": 0.02513809138885111,\n \"\
acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.02513809138885111\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n\
\ \"acc_stderr\": 0.024472243840895525,\n \"acc_norm\": 0.7548387096774194,\n\
\ \"acc_norm_stderr\": 0.024472243840895525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593542,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593542\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062157,\n\
\ \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062157\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.02904560029061626,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.02904560029061626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059285,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059285\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8036697247706422,\n \"acc_stderr\": 0.017030719339154343,\n \"\
acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.017030719339154343\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"\
acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.02886743144984932,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.02886743144984932\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676173,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676173\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281386,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281386\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n\
\ \"acc_stderr\": 0.014036945850381387,\n \"acc_norm\": 0.80970625798212,\n\
\ \"acc_norm_stderr\": 0.014036945850381387\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2670391061452514,\n\
\ \"acc_stderr\": 0.014796502622562555,\n \"acc_norm\": 0.2670391061452514,\n\
\ \"acc_norm_stderr\": 0.014796502622562555\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.023805186524888135,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.023805186524888135\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n\
\ \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.455019556714472,\n\
\ \"acc_stderr\": 0.012718456618701766,\n \"acc_norm\": 0.455019556714472,\n\
\ \"acc_norm_stderr\": 0.012718456618701766\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6544117647058824,\n \"acc_stderr\": 0.028888193103988637,\n\
\ \"acc_norm\": 0.6544117647058824,\n \"acc_norm_stderr\": 0.028888193103988637\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6552287581699346,\n \"acc_stderr\": 0.019228322018696644,\n \
\ \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.019228322018696644\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.029162738410249772,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.029162738410249772\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482706,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.036845294917747115,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.036845294917747115\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2864137086903305,\n\
\ \"mc1_stderr\": 0.01582614243950234,\n \"mc2\": 0.4326565666387199,\n\
\ \"mc2_stderr\": 0.01546107570311087\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7963693764798737,\n \"acc_stderr\": 0.011317798781626925\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.38741470811220624,\n \
\ \"acc_stderr\": 0.013418798447827378\n }\n}\n```"
repo_url: https://huggingface.co/NovoCode/Mistral-NeuralDPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|arc:challenge|25_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|gsm8k|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hellaswag|10_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T05-08-02.139201.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-19T05-08-02.139201.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- '**/details_harness|winogrande|5_2024-02-19T05-08-02.139201.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-19T05-08-02.139201.parquet'
- config_name: results
data_files:
- split: 2024_02_19T05_08_02.139201
path:
- results_2024-02-19T05-08-02.139201.parquet
- split: latest
path:
- results_2024-02-19T05-08-02.139201.parquet
---
# Dataset Card for Evaluation run of NovoCode/Mistral-NeuralDPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NovoCode/Mistral-NeuralDPO](https://huggingface.co/NovoCode/Mistral-NeuralDPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NovoCode__Mistral-NeuralDPO",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-19T05:08:02.139201](https://huggingface.co/datasets/open-llm-leaderboard/details_NovoCode__Mistral-NeuralDPO/blob/main/results_2024-02-19T05-08-02.139201.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6377097887516342,
"acc_stderr": 0.03236225279625848,
"acc_norm": 0.6430625035121067,
"acc_norm_stderr": 0.033015613528751424,
"mc1": 0.2864137086903305,
"mc1_stderr": 0.01582614243950234,
"mc2": 0.4326565666387199,
"mc2_stderr": 0.01546107570311087
},
"harness|arc:challenge|25": {
"acc": 0.6271331058020477,
"acc_stderr": 0.014131176760131169,
"acc_norm": 0.6604095563139932,
"acc_norm_stderr": 0.013839039762820169
},
"harness|hellaswag|10": {
"acc": 0.6560446126269668,
"acc_stderr": 0.004740555782142176,
"acc_norm": 0.8469428400716988,
"acc_norm_stderr": 0.003593067633474304
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901409,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901409
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438662,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438662
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396264,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396264
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.02513809138885111,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.02513809138885111
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.024472243840895525,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.024472243840895525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593542,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593542
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062157,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062157
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.02904560029061626,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.02904560029061626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059285,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059285
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.017030719339154343,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.017030719339154343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.02886743144984932,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.02886743144984932
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676173,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281386,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281386
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381387,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381387
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2670391061452514,
"acc_stderr": 0.014796502622562555,
"acc_norm": 0.2670391061452514,
"acc_norm_stderr": 0.014796502622562555
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.023805186524888135,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.023805186524888135
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.455019556714472,
"acc_stderr": 0.012718456618701766,
"acc_norm": 0.455019556714472,
"acc_norm_stderr": 0.012718456618701766
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6544117647058824,
"acc_stderr": 0.028888193103988637,
"acc_norm": 0.6544117647058824,
"acc_norm_stderr": 0.028888193103988637
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.019228322018696644,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.019228322018696644
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.029162738410249772,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.029162738410249772
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482706,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.036845294917747115,
"acc_norm": 0.84,
"acc_norm_stderr": 0.036845294917747115
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2864137086903305,
"mc1_stderr": 0.01582614243950234,
"mc2": 0.4326565666387199,
"mc2_stderr": 0.01546107570311087
},
"harness|winogrande|5": {
"acc": 0.7963693764798737,
"acc_stderr": 0.011317798781626925
},
"harness|gsm8k|5": {
"acc": 0.38741470811220624,
"acc_stderr": 0.013418798447827378
}
}
```
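As a rough sketch of how these per-task numbers can be inspected, the snippet below ranks a hand-copied subset of the JSON above by accuracy to spot the model's weakest MMLU subtasks (the `results` dict here is a small subset copied verbatim from the results, not loaded from the Hub):

```python
# Rank a few per-task accuracies (copied from the results above) to find
# the weakest MMLU ("hendrycksTest") subtask. Keys follow the harness
# naming scheme: harness|hendrycksTest-<subtask>|<n_shots>.
results = {
    "harness|hendrycksTest-moral_scenarios|5": {"acc": 0.2670391061452514},
    "harness|hendrycksTest-formal_logic|5": {"acc": 0.36507936507936506},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8760683760683761},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8245614035087719},
}

# Strip the harness prefix and shot suffix to get readable subtask names.
mmlu = {
    name.split("-", 1)[1].split("|")[0]: scores["acc"]
    for name, scores in results.items()
    if "hendrycksTest" in name
}

ranked = sorted(mmlu.items(), key=lambda kv: kv[1])
print(ranked[0])  # weakest subtask first
```

The same pattern applies to the full set of `hendrycksTest` keys once the complete results JSON is loaded.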
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/iris_konosuba | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of iris (Kono Subarashii Sekai ni Shukufuku wo!)
This is the dataset of iris (Kono Subarashii Sekai ni Shukufuku wo!), containing 40 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
|
uncleben006/mini-platypus-two | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245925
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
RikoteMaster/llama2_classifying_and_explainning_v4 | ---
dataset_info:
features:
- name: Explanation
dtype: string
- name: Text_processed
dtype: string
- name: Emotion
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 14114420
num_examples: 13859
download_size: 3770656
dataset_size: 14114420
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama2_classifying_and_explainning_v4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_azarafrooz__mistral2-sp-lima-test | ---
pretty_name: Evaluation run of azarafrooz/mistral2-sp-lima-test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [azarafrooz/mistral2-sp-lima-test](https://huggingface.co/azarafrooz/mistral2-sp-lima-test)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_azarafrooz__mistral2-sp-lima-test\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-12T18:14:42.775659](https://huggingface.co/datasets/open-llm-leaderboard/details_azarafrooz__mistral2-sp-lima-test/blob/main/results_2024-03-12T18-14-42.775659.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2531055542088679,\n\
\ \"acc_stderr\": 0.030786940464820767,\n \"acc_norm\": 0.254650078940353,\n\
\ \"acc_norm_stderr\": 0.03161533222162643,\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.014843061507731608,\n \"mc2\": NaN,\n \"\
mc2_stderr\": NaN\n },\n \"harness|arc:challenge|25\": {\n \"acc\"\
: 0.21331058020477817,\n \"acc_stderr\": 0.011970971742326334,\n \"\
acc_norm\": 0.29436860068259385,\n \"acc_norm_stderr\": 0.013318528460539426\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2605058753236407,\n\
\ \"acc_stderr\": 0.004380136468543944,\n \"acc_norm\": 0.26160127464648475,\n\
\ \"acc_norm_stderr\": 0.004386083683839616\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3026315789473684,\n \"acc_stderr\": 0.037385206761196665,\n\
\ \"acc_norm\": 0.3026315789473684,\n \"acc_norm_stderr\": 0.037385206761196665\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.02544786382510861,\n\
\ \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.02544786382510861\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24855491329479767,\n\
\ \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.24855491329479767,\n\
\ \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.20425531914893616,\n \"acc_stderr\": 0.026355158413349424,\n\
\ \"acc_norm\": 0.20425531914893616,\n \"acc_norm_stderr\": 0.026355158413349424\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309993,\n\
\ \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309993\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577656,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577656\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\
\ \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n\
\ \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25161290322580643,\n\
\ \"acc_stderr\": 0.024685979286239956,\n \"acc_norm\": 0.25161290322580643,\n\
\ \"acc_norm_stderr\": 0.024685979286239956\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.035243908445117836,\n\
\ \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.035243908445117836\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.25252525252525254,\n \"acc_stderr\": 0.030954055470365897,\n \"\
acc_norm\": 0.25252525252525254,\n \"acc_norm_stderr\": 0.030954055470365897\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178256,\n\
\ \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178256\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722128013,\n\
\ \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722128013\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.22201834862385322,\n\
\ \"acc_stderr\": 0.01781884956479663,\n \"acc_norm\": 0.22201834862385322,\n\
\ \"acc_norm_stderr\": 0.01781884956479663\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.027920963147993656,\n\
\ \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.027920963147993656\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693264,\n \"\
acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693264\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658335,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658335\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.20179372197309417,\n\
\ \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.20179372197309417,\n\
\ \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952687,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952687\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.039166677628225836,\n\
\ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.039166677628225836\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n\
\ \"acc_stderr\": 0.02860595370200425,\n \"acc_norm\": 0.2564102564102564,\n\
\ \"acc_norm_stderr\": 0.02860595370200425\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2707535121328225,\n\
\ \"acc_stderr\": 0.015889888362560486,\n \"acc_norm\": 0.2707535121328225,\n\
\ \"acc_norm_stderr\": 0.015889888362560486\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.29190751445086704,\n \"acc_stderr\": 0.02447699407624734,\n\
\ \"acc_norm\": 0.29190751445086704,\n \"acc_norm_stderr\": 0.02447699407624734\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875195,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875195\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.2990353697749196,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2932098765432099,\n \"acc_stderr\": 0.02532988817190092,\n\
\ \"acc_norm\": 0.2932098765432099,\n \"acc_norm_stderr\": 0.02532988817190092\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590638,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590638\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27053455019556716,\n\
\ \"acc_stderr\": 0.011345996743539264,\n \"acc_norm\": 0.27053455019556716,\n\
\ \"acc_norm_stderr\": 0.011345996743539264\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.16544117647058823,\n \"acc_stderr\": 0.022571771025494767,\n\
\ \"acc_norm\": 0.16544117647058823,\n \"acc_norm_stderr\": 0.022571771025494767\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2761437908496732,\n \"acc_stderr\": 0.018087276935663137,\n \
\ \"acc_norm\": 0.2761437908496732,\n \"acc_norm_stderr\": 0.018087276935663137\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788163,\n\
\ \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.027372942201788163\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.20481927710843373,\n\
\ \"acc_stderr\": 0.03141784291663926,\n \"acc_norm\": 0.20481927710843373,\n\
\ \"acc_norm_stderr\": 0.03141784291663926\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.034886477134579215,\n\
\ \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.034886477134579215\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.014843061507731608,\n \"mc2\": NaN,\n \"\
mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4972375690607735,\n\
\ \"acc_stderr\": 0.014052271211616448\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/azarafrooz/mistral2-sp-lima-test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|arc:challenge|25_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|gsm8k|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hellaswag|10_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T18-14-42.775659.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-12T18-14-42.775659.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- '**/details_harness|winogrande|5_2024-03-12T18-14-42.775659.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-12T18-14-42.775659.parquet'
- config_name: results
data_files:
- split: 2024_03_12T18_14_42.775659
path:
- results_2024-03-12T18-14-42.775659.parquet
- split: latest
path:
- results_2024-03-12T18-14-42.775659.parquet
---
# Dataset Card for Evaluation run of azarafrooz/mistral2-sp-lima-test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [azarafrooz/mistral2-sp-lima-test](https://huggingface.co/azarafrooz/mistral2-sp-lima-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_azarafrooz__mistral2-sp-lima-test",
"harness_winogrande_5",
split="train")
```
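The per-run splits listed in this card's `configs` section are named after the run timestamp. As a sketch (the exact convention is inferred from the split names in this card, not from an official API), the mapping replaces `-` and `:` with `_`:

```python
# Derive the split name used in this card's configs from a run timestamp.
# Assumption: naming is inferred from the split names listed above.
timestamp = "2024-03-12T18:14:42.775659"
split_name = timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2024_03_12T18_14_42.775659
```

You can pass this split name to `load_dataset` in place of `"latest"` to pin a specific run.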
## Latest results
These are the [latest results from run 2024-03-12T18:14:42.775659](https://huggingface.co/datasets/open-llm-leaderboard/details_azarafrooz__mistral2-sp-lima-test/blob/main/results_2024-03-12T18-14-42.775659.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2531055542088679,
"acc_stderr": 0.030786940464820767,
"acc_norm": 0.254650078940353,
"acc_norm_stderr": 0.03161533222162643,
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731608,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.21331058020477817,
"acc_stderr": 0.011970971742326334,
"acc_norm": 0.29436860068259385,
"acc_norm_stderr": 0.013318528460539426
},
"harness|hellaswag|10": {
"acc": 0.2605058753236407,
"acc_stderr": 0.004380136468543944,
"acc_norm": 0.26160127464648475,
"acc_norm_stderr": 0.004386083683839616
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3026315789473684,
"acc_stderr": 0.037385206761196665,
"acc_norm": 0.3026315789473684,
"acc_norm_stderr": 0.037385206761196665
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.02544786382510861,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.02544786382510861
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20425531914893616,
"acc_stderr": 0.026355158413349424,
"acc_norm": 0.20425531914893616,
"acc_norm_stderr": 0.026355158413349424
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.03806142687309993,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.03806142687309993
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577656,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577656
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25161290322580643,
"acc_stderr": 0.024685979286239956,
"acc_norm": 0.25161290322580643,
"acc_norm_stderr": 0.024685979286239956
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.28484848484848485,
"acc_stderr": 0.035243908445117836,
"acc_norm": 0.28484848484848485,
"acc_norm_stderr": 0.035243908445117836
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25252525252525254,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.25252525252525254,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.030276909945178256,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.030276909945178256
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.020752423722128013,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.020752423722128013
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22201834862385322,
"acc_stderr": 0.01781884956479663,
"acc_norm": 0.22201834862385322,
"acc_norm_stderr": 0.01781884956479663
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.027920963147993656,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.027920963147993656
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693264,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693264
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658335,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658335
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.20179372197309417,
"acc_stderr": 0.026936111912802273,
"acc_norm": 0.20179372197309417,
"acc_norm_stderr": 0.026936111912802273
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952687,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952687
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.02860595370200425,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.02860595370200425
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2707535121328225,
"acc_stderr": 0.015889888362560486,
"acc_norm": 0.2707535121328225,
"acc_norm_stderr": 0.015889888362560486
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.29190751445086704,
"acc_stderr": 0.02447699407624734,
"acc_norm": 0.29190751445086704,
"acc_norm_stderr": 0.02447699407624734
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2932098765432099,
"acc_stderr": 0.02532988817190092,
"acc_norm": 0.2932098765432099,
"acc_norm_stderr": 0.02532988817190092
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590638,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590638
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27053455019556716,
"acc_stderr": 0.011345996743539264,
"acc_norm": 0.27053455019556716,
"acc_norm_stderr": 0.011345996743539264
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.16544117647058823,
"acc_stderr": 0.022571771025494767,
"acc_norm": 0.16544117647058823,
"acc_norm_stderr": 0.022571771025494767
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2761437908496732,
"acc_stderr": 0.018087276935663137,
"acc_norm": 0.2761437908496732,
"acc_norm_stderr": 0.018087276935663137
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.20481927710843373,
"acc_stderr": 0.03141784291663926,
"acc_norm": 0.20481927710843373,
"acc_norm_stderr": 0.03141784291663926
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.034886477134579215,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.034886477134579215
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731608,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.4972375690607735,
"acc_stderr": 0.014052271211616448
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
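Once loaded, the nested results above can be post-processed directly in Python. For instance, ranking tasks by accuracy (a minimal sketch over a hand-copied subset of the values shown, purely illustrative):

```python
# Hand-copied subset of the per-task results above (illustrative only)
results = {
    "harness|arc:challenge|25": {"acc": 0.21331058020477817},
    "harness|hellaswag|10": {"acc": 0.2605058753236407},
    "harness|winogrande|5": {"acc": 0.4972375690607735},
}

# Rank tasks from highest to lowest accuracy
ranked = sorted(results.items(), key=lambda kv: kv[1]["acc"], reverse=True)
best_task, best_scores = ranked[0]
print(best_task)  # harness|winogrande|5
```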
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ks21/Joe_Buck_the_GOATv3 | ---
dataset_info:
features:
- name: text
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 7232335.0
num_examples: 40
download_size: 3615356
dataset_size: 7232335.0
---
# Dataset Card for "Joe_Buck_the_GOATv3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NathanRoll/CV_train | ---
dataset_info:
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accent
dtype: string
- name: locale
dtype: string
- name: segment
dtype: string
splits:
- name: train
num_bytes: 20472485884.512
num_examples: 564337
download_size: 25625771028
dataset_size: 20472485884.512
---
# Dataset Card for "CV_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ailyth/Playstation4GamePatchData | ---
license: mit
---
This dataset records update information for all PS4 games, including the CUSA ID, game title, and latest version. Games that have never released a patch (i.e., are still at version 1.0) are not included in this dataset. |
autoevaluate/autoeval-staging-eval-emotion-default-e0ea2e-17426359 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- emotion
eval_info:
task: multi_class_classification
model: lewiswatson/distilbert-base-uncased-finetuned-emotion
metrics: []
dataset_name: emotion
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: lewiswatson/distilbert-base-uncased-finetuned-emotion
* Dataset: emotion
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
hiennguyennq/train_v4 | ---
license: apache-2.0
---
|
danielz01/xView1 | ---
dataset_info:
features:
- name: image
dtype: image
- name: width
dtype: int64
- name: height
dtype: int64
- name: objects
struct:
- name: bbox
sequence:
sequence: float64
- name: category
sequence: int64
- name: path
dtype: string
- name: chip_id
dtype: int64
splits:
- name: train
num_bytes: 20904228042.0
num_examples: 26541
download_size: 17082075353
dataset_size: 20904228042.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "xView1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rsalshalan/SADA | ---
license: apache-2.0
---
|
aneeshas/imsdb-horror-movie-scripts | ---
dataset_info:
features:
- name: Horror
dtype: string
splits:
- name: train
num_bytes: 37500587
num_examples: 150
download_size: 17513933
dataset_size: 37500587
---
# Dataset Card for "imsdb-horror-movie-scripts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
abhigadgil15/platypus_mini | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245925
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_dvruette__oasst-gpt-neox-20b-3000-steps | ---
pretty_name: Evaluation run of dvruette/oasst-gpt-neox-20b-3000-steps
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dvruette/oasst-gpt-neox-20b-3000-steps](https://huggingface.co/dvruette/oasst-gpt-neox-20b-3000-steps)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dvruette__oasst-gpt-neox-20b-3000-steps\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T10:27:17.935969](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__oasst-gpt-neox-20b-3000-steps/blob/main/results_2023-09-17T10-27-17.935969.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.03261325503355705,\n\
\ \"em_stderr\": 0.0018190171380944452,\n \"f1\": 0.08462353187919494,\n\
\ \"f1_stderr\": 0.0021613692798517184,\n \"acc\": 0.35813018759877047,\n\
\ \"acc_stderr\": 0.008817523952258153\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.03261325503355705,\n \"em_stderr\": 0.0018190171380944452,\n\
\ \"f1\": 0.08462353187919494,\n \"f1_stderr\": 0.0021613692798517184\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02880970432145565,\n \
\ \"acc_stderr\": 0.00460748428376746\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6874506708760852,\n \"acc_stderr\": 0.013027563620748847\n\
\ }\n}\n```"
repo_url: https://huggingface.co/dvruette/oasst-gpt-neox-20b-3000-steps
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|arc:challenge|25_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T10_27_17.935969
path:
- '**/details_harness|drop|3_2023-09-17T10-27-17.935969.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T10-27-17.935969.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T10_27_17.935969
path:
- '**/details_harness|gsm8k|5_2023-09-17T10-27-17.935969.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T10-27-17.935969.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hellaswag|10_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:33:10.003072.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T21:33:10.003072.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T21:33:10.003072.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T10_27_17.935969
path:
- '**/details_harness|winogrande|5_2023-09-17T10-27-17.935969.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T10-27-17.935969.parquet'
- config_name: results
data_files:
- split: 2023_07_19T21_33_10.003072
path:
- results_2023-07-19T21:33:10.003072.parquet
- split: 2023_09_17T10_27_17.935969
path:
- results_2023-09-17T10-27-17.935969.parquet
- split: latest
path:
- results_2023-09-17T10-27-17.935969.parquet
---
# Dataset Card for Evaluation run of dvruette/oasst-gpt-neox-20b-3000-steps
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dvruette/oasst-gpt-neox-20b-3000-steps
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [dvruette/oasst-gpt-neox-20b-3000-steps](https://huggingface.co/dvruette/oasst-gpt-neox-20b-3000-steps) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dvruette__oasst-gpt-neox-20b-3000-steps",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T10:27:17.935969](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__oasst-gpt-neox-20b-3000-steps/blob/main/results_2023-09-17T10-27-17.935969.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.03261325503355705,
"em_stderr": 0.0018190171380944452,
"f1": 0.08462353187919494,
"f1_stderr": 0.0021613692798517184,
"acc": 0.35813018759877047,
"acc_stderr": 0.008817523952258153
},
"harness|drop|3": {
"em": 0.03261325503355705,
"em_stderr": 0.0018190171380944452,
"f1": 0.08462353187919494,
"f1_stderr": 0.0021613692798517184
},
"harness|gsm8k|5": {
"acc": 0.02880970432145565,
"acc_stderr": 0.00460748428376746
},
"harness|winogrande|5": {
"acc": 0.6874506708760852,
"acc_stderr": 0.013027563620748847
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tollefj/xsum-en-no-transfer | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: document
dtype: string
- name: id
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 479736366
num_examples: 204045
- name: test
num_bytes: 26786028
num_examples: 11334
- name: validation
num_bytes: 26319794
num_examples: 11332
download_size: 338414434
dataset_size: 532842188
task_categories:
- summarization
- translation
language:
- nb
- 'no'
- en
pretty_name: xsum English to Norwegian
size_categories:
- 100K<n<1M
---
# XSUM - English to Norwegian Bokmål
An experiment in joint translation and summarization.
- English xsum: https://huggingface.co/datasets/xsum
- Norwegian (bokmål) translated xsum: https://huggingface.co/datasets/tollefj/norwegian-xsum-nob |
GEM/web_nlg | ---
annotations_creators:
- unknown
language_creators:
- unknown
language:
- en
license:
- cc-by-nc-4.0
multilinguality:
- unknown
size_categories:
- unknown
source_datasets:
- original
task_categories:
- table-to-text
task_ids: []
pretty_name: web_nlg
tags:
- data-to-text
---
# Dataset Card for GEM/web_nlg
## Dataset Description
- **Homepage:** https://webnlg-challenge.loria.fr/
- **Repository:** https://gitlab.com/shimorina/webnlg-dataset
- **Paper:** [First Dataset Release](http://www.aclweb.org/anthology/P17-1017), [WebNLG Challenge 2017 Report](https://www.aclweb.org/anthology/W17-3518/)
- **Leaderboard:** https://beng.dice-research.org/gerbil/
- **Point of Contact:** [Needs More Information]
### Link to Main Data Card
You can find the main data card on the [GEM Website](https://gem-benchmark.com/data_cards/web_nlg).
### Dataset Summary
WebNLG is a bilingual dataset (English, Russian) of parallel DBpedia triple sets and short texts that cover about 450 different DBpedia properties. The WebNLG data was originally created to promote the development of RDF verbalisers able to generate short texts and to handle micro-planning (i.e., sentence segmentation and ordering, referring expression generation, aggregation); the goal of the task is to generate texts starting from 1 to 7 input triples which have entities in common (so the input is actually a connected Knowledge Graph). The dataset contains about 17,000 triple sets and 45,000 crowdsourced texts in English, and 7,000 triple sets and 19,000 crowdsourced texts in Russian. A challenging test-set section with entities and/or properties that have not been seen at training time is available.
You can load the dataset via:
```
import datasets
data = datasets.load_dataset('GEM/web_nlg')
```
The data loader can be found [here](https://huggingface.co/datasets/GEM/web_nlg).
#### website
[Website](https://webnlg-challenge.loria.fr/)
#### paper
[First Dataset Release](http://www.aclweb.org/anthology/P17-1017), [WebNLG Challenge 2017 Report](https://www.aclweb.org/anthology/W17-3518/), [WebNLG Challenge 2020 Report](https://webnlg-challenge.loria.fr/files/2020.webnlg-papers.7.pdf)
#### authors
The principal curator of the dataset is Anastasia Shimorina (Université de Lorraine / LORIA, France). Throughout the WebNLG releases, several people contributed to their construction: Claire Gardent (CNRS / LORIA, France), Shashi Narayan (Google, UK), Laura Perez-Beltrachini (University of Edinburgh, UK), Elena Khasanova, and Thiago Castro Ferreira (Federal University of Minas Gerais, Brazil).
## Dataset Overview
### Where to find the Data and its Documentation
#### Webpage
<!-- info: What is the webpage for the dataset (if it exists)? -->
<!-- scope: telescope -->
[Website](https://webnlg-challenge.loria.fr/)
#### Download
<!-- info: What is the link to where the original dataset is hosted? -->
<!-- scope: telescope -->
[Gitlab](https://gitlab.com/shimorina/webnlg-dataset)
#### Paper
<!-- info: What is the link to the paper describing the dataset (open access preferred)? -->
<!-- scope: telescope -->
[First Dataset Release](http://www.aclweb.org/anthology/P17-1017), [WebNLG Challenge 2017 Report](https://www.aclweb.org/anthology/W17-3518/), [WebNLG Challenge 2020 Report](https://webnlg-challenge.loria.fr/files/2020.webnlg-papers.7.pdf)
#### BibTex
<!-- info: Provide the BibTex-formatted reference for the dataset. Please use the correct published version (ACL anthology, etc.) instead of google scholar created Bibtex. -->
<!-- scope: microscope -->
Initial release of the dataset:
```
@inproceedings{gardent2017creating,
author = "Gardent, Claire
and Shimorina, Anastasia
and Narayan, Shashi
and Perez-Beltrachini, Laura",
title = "Creating Training Corpora for NLG Micro-Planners",
booktitle = "Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
year = "2017",
publisher = "Association for Computational Linguistics",
pages = "179--188",
location = "Vancouver, Canada",
doi = "10.18653/v1/P17-1017",
url = "http://www.aclweb.org/anthology/P17-1017"
}
```
The latest version 3.0:
```
@inproceedings{castro-ferreira20:bilin-bi-direc-webnl-shared,
title={The 2020 Bilingual, Bi-Directional WebNLG+ Shared Task Overview and Evaluation Results (WebNLG+ 2020)},
author={Castro Ferreira, Thiago and
Gardent, Claire and
Ilinykh, Nikolai and
van der Lee, Chris and
Mille, Simon and
Moussallem, Diego and
Shimorina, Anastasia},
booktitle = {Proceedings of the 3rd WebNLG Workshop on Natural Language Generation from the Semantic Web (WebNLG+ 2020)},
pages = "55--76",
year = 2020,
address = {Dublin, Ireland (Virtual)},
publisher = {Association for Computational Linguistics}}
```
#### Contact Email
<!-- info: If known, provide the email of at least one person the reader can contact for questions about the dataset. -->
<!-- scope: periscope -->
webnlg-challenge@inria.fr
#### Has a Leaderboard?
<!-- info: Does the dataset have an active leaderboard? -->
<!-- scope: telescope -->
yes
#### Leaderboard Link
<!-- info: Provide a link to the leaderboard. -->
<!-- scope: periscope -->
[Website](https://beng.dice-research.org/gerbil/)
#### Leaderboard Details
<!-- info: Briefly describe how the leaderboard evaluates models. -->
<!-- scope: microscope -->
The model outputs are evaluated against the crowdsourced references; the leaderboard reports BLEU-4, METEOR, chrF++, TER, BERTScore and BLEURT scores.
### Languages and Intended Use
#### Multilingual?
<!-- quick -->
<!-- info: Is the dataset multilingual? -->
<!-- scope: telescope -->
yes
#### Covered Languages
<!-- quick -->
<!-- info: What languages/dialects are covered in the dataset? -->
<!-- scope: telescope -->
`Russian`, `English`
#### License
<!-- quick -->
<!-- info: What is the license of the dataset? -->
<!-- scope: telescope -->
cc-by-nc-4.0: Creative Commons Attribution Non Commercial 4.0 International
#### Intended Use
<!-- info: What is the intended use of the dataset? -->
<!-- scope: microscope -->
The WebNLG dataset was created to promote the development (_i_) of RDF verbalisers and (_ii_) of microplanners able to handle a wide range of linguistic constructions. The dataset aims at covering knowledge in different domains ("categories"). The same properties and entities can appear in several categories.
#### Primary Task
<!-- info: What primary task does the dataset support? -->
<!-- scope: telescope -->
Data-to-Text
#### Communicative Goal
<!-- quick -->
<!-- info: Provide a short description of the communicative goal of a model trained for this task on this dataset. -->
<!-- scope: periscope -->
A model should verbalize all and only the provided input triples in natural language.
### Credit
#### Curation Organization Type(s)
<!-- info: In what kind of organization did the dataset curation happen? -->
<!-- scope: telescope -->
`academic`
#### Curation Organization(s)
<!-- info: Name the organization(s). -->
<!-- scope: periscope -->
Université de Lorraine / LORIA, France, CNRS / LORIA, France, University of Edinburgh, UK, Federal University of Minas Gerais, Brazil
#### Dataset Creators
<!-- info: Who created the original dataset? List the people involved in collecting the dataset and their affiliation(s). -->
<!-- scope: microscope -->
The principal curator of the dataset is Anastasia Shimorina (Université de Lorraine / LORIA, France). Throughout the WebNLG releases, several people contributed to their construction: Claire Gardent (CNRS / LORIA, France), Shashi Narayan (Google, UK), Laura Perez-Beltrachini (University of Edinburgh, UK), Elena Khasanova, and Thiago Castro Ferreira (Federal University of Minas Gerais, Brazil).
#### Funding
<!-- info: Who funded the data creation? -->
<!-- scope: microscope -->
The dataset construction was funded by the French National Research Agency (ANR).
#### Who added the Dataset to GEM?
<!-- info: Who contributed to the data card and adding the dataset to GEM? List the people+affiliations involved in creating this data card and who helped integrate this dataset into GEM. -->
<!-- scope: microscope -->
Simon Mille and Sebastian Gehrmann added the dataset and wrote the data card.
### Dataset Structure
#### Data Fields
<!-- info: List and describe the fields present in the dataset. -->
<!-- scope: telescope -->
See [official documentation](https://webnlg-challenge.loria.fr/docs/).
`entry`: a data instance of the benchmark. Each entry has five attributes: a DBpedia category (`category`), entry ID (`eid`), shape, shape type, and triple set size (`size`).
- `shape`: a string representation of the RDF tree with nested parentheses where `X` is a node (see [Newick tree format](https://en.wikipedia.org/wiki/Newick_format)).
- `shape_type`: a type of the tree shape. We [identify](https://www.aclweb.org/anthology/C16-1141.pdf) three types of tree shapes:
* `chain` (the object of one triple is the subject of the other);
* `sibling` (triples with a shared subject);
* `mixed` (both `chain` and `sibling` types present).
- `eid`: an entry ID. It is unique only within a category and a size.
- `category`: a DBpedia category (Astronaut, City, MusicalWork, Politician, etc.).
- `size`: the number of RDF triples in a set. Ranges from 1 to 7.
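The three shape types can also be recovered directly from a triple set. A minimal sketch of that heuristic (the function name and the `"single"` fallback label for one-triple inputs are our own conventions, not part of the official WebNLG tooling):

```python
def classify_shape_type(triples):
    """Heuristically classify a triple set as chain / sibling / mixed.

    `triples` is a list of (subject, property, object) tuples, as found
    in the `modifiedtripleset` field.
    """
    subjects = [t[0] for t in triples]
    objects = [t[2] for t in triples]
    # chain: the object of one triple is the subject of another
    has_chain = any(o in subjects for o in objects)
    # sibling: at least two triples share a subject
    has_sibling = len(subjects) != len(set(subjects))
    if has_chain and has_sibling:
        return "mixed"
    if has_chain:
        return "chain"
    if has_sibling:
        return "sibling"
    return "single"  # e.g. one-triple inputs


# The four "Trane" triples from the example instance below all share
# one subject, so the entry is classified as "sibling".
example = [
    ("Trane", "foundingDate", "1913-01-01"),
    ("Trane", "location", "Ireland"),
    ("Trane", "foundationPlace", "La_Crosse,_Wisconsin"),
    ("Trane", "numberOfEmployees", "29000"),
]
print(classify_shape_type(example))  # sibling
```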
Each `entry` has three fields: `originaltripleset`, `modifiedtripleset`, and `lexs`.
`originaltripleset`: a set of RDF triples as extracted from [DBpedia](https://wiki.dbpedia.org/). Each set of RDF triples is a tree. Triples have the subject-predicate-object structure.
`modifiedtripleset`: a set of RDF triples as presented to crowdworkers (for more details on modifications, see below).
Original and modified triples serve different purposes: the original triples link the data to a knowledge base (DBpedia), whereas the modified triples ensure consistency and homogeneity throughout the data. To train models, the modified triples should be used.
`lexs` (short for lexicalisations): a natural language text verbalising the triples. Each lexicalisation has two attributes: a comment (`comment`) and a lexicalisation ID (`lid`). By default, comments have the value `good`, except in rare cases where they were manually marked as `toFix`. That was done during corpus creation, when a lexicalisation was seen not to exactly match its triple set.
Russian data has additional optional fields compared to English:
`<dbpedialinks>`: RDF triples extracted from DBpedia between English and Russian entities by means of the property `sameAs`.
`<links>`: RDF triples created manually for some entities to serve as pointers to translators. There are two types of them:
* with `sameAs` (`Spaniards | sameAs | испанцы`)
* with `includes` (`Tomatoes, guanciale, cheese, olive oil | includes | гуанчиале`). Those were mostly created for string literals to translate some parts of them.
Lexicalisations in the Russian WebNLG have a new parameter `lang` (values: `en`, `ru`) because original English texts were kept in the Russian version (see the example above).
#### Example Instance
<!-- info: Provide a JSON formatted example of a typical instance in the dataset. -->
<!-- scope: periscope -->
```
{
"entry": {
"category": "Company",
"size": "4",
"shape": "(X (X) (X) (X) (X))",
"shape_type": "sibling",
"eid": "Id21",
"lexs": [
{
"comment": "good",
"lex": "Trane, which was founded on January 1st 1913 in La Crosse, Wisconsin, is based in Ireland. It has 29,000 employees.",
"lid": "Id1"
}
],
"modifiedtripleset": [
{
"subject": "Trane",
"property": "foundingDate",
"object": "1913-01-01"
},
{
"subject": "Trane",
"property": "location",
"object": "Ireland"
},
{
"subject": "Trane",
"property": "foundationPlace",
"object": "La_Crosse,_Wisconsin"
},
{
"subject": "Trane",
"property": "numberOfEmployees",
"object": "29000"
}
],
"originaltriplesets": {
"originaltripleset": [
{
"subject": "Trane",
"property": "foundingDate",
"object": "1913-01-01"
},
{
"subject": "Trane",
"property": "location",
"object": "Ireland"
},
{
"subject": "Trane",
"property": "foundationPlace",
"object": "La_Crosse,_Wisconsin"
},
{
"subject": "Trane",
"property": "numberOfEmployees",
"object": "29000"
}
]
}
}
}
```
The XML-formatted example is [here](https://webnlg-challenge.loria.fr/docs/#example).
#### Data Splits
<!-- info: Describe and name the splits in the dataset if there are more than one. -->
<!-- scope: periscope -->
| English (v3.0) | Train | Dev | Test |
|-----------------|--------|-------|-------|
| **triple sets** | 13,211 | 1,667 | 1,779 |
| **texts** | 35,426 | 4,464 | 5,150 |
| **properties** | 372 | 290 | 220 |
| Russian (v3.0) | Train | Dev | Test |
|-----------------|--------|-------|-------|
| **triple sets** | 5,573 | 790 | 1,102 |
| **texts** | 14,239 | 2,026 | 2,780 |
| **properties** | 226 | 115 | 192 |
## Dataset in GEM
### Rationale for Inclusion in GEM
#### Why is the Dataset in GEM?
<!-- info: What does this dataset contribute toward better generation evaluation and why is it part of GEM? -->
<!-- scope: microscope -->
Due to the constrained generation task, this dataset can be used to evaluate very specific and narrow generation capabilities.
#### Similar Datasets
<!-- info: Do other datasets for the high level task exist? -->
<!-- scope: telescope -->
yes
#### Unique Language Coverage
<!-- info: Does this dataset cover other languages than other datasets for the same task? -->
<!-- scope: periscope -->
yes
#### Difference from other GEM datasets
<!-- info: What else sets this dataset apart from other similar datasets in GEM? -->
<!-- scope: microscope -->
The RDF-triple format is unique to WebNLG.
#### Ability that the Dataset measures
<!-- info: What aspect of model ability can be measured with this dataset? -->
<!-- scope: periscope -->
surface realization
### GEM-Specific Curation
#### Modificatied for GEM?
<!-- info: Has the GEM version of the dataset been modified in any way (data, processing, splits) from the original curated data? -->
<!-- scope: telescope -->
yes
#### GEM Modifications
<!-- info: What changes have been made to the original dataset? -->
<!-- scope: periscope -->
`other`
#### Modification Details
<!-- info: For each of these changes, described them in more details and provided the intended purpose of the modification -->
<!-- scope: microscope -->
No changes to the main content of the dataset. The [version 3.0](https://gitlab.com/shimorina/webnlg-dataset/-/tree/master/release_v3.0) of the dataset is used.
#### Additional Splits?
<!-- info: Does GEM provide additional splits to the dataset? -->
<!-- scope: telescope -->
yes
#### Split Information
<!-- info: Describe how the new splits were created -->
<!-- scope: periscope -->
23 special test sets for WebNLG were added to the GEM evaluation suite, 12 for English and 11 for Russian.
For both languages, we created subsets of the training and development sets of ~500 randomly selected inputs each. The inputs were sampled proportionally from each category.
Two types of transformations have been applied to WebNLG: (i) input scrambling (English and Russian) and (ii) numerical value replacements (English); in both cases, a subset of about 500 inputs was randomly selected. For (i), the order of the triples was randomly reassigned (each triple kept its internal Subject-Property-Object order). For (ii), the change was performed respecting the format of the original cardinal value (e.g., alpha, integer, or floating-point) and replacing it with a new random value. The new number is lower-bounded at zero and upper-bounded by the next power of 10 above the given value (e.g., replacing 54 would result in a random value between 0 and 100). Floating-point values maintain the original degree of precision.
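The numerical replacement rule described above can be sketched as follows for non-negative integers. This is an illustrative reconstruction, not the actual GEM transformation code; the function names are our own:

```python
import random


def replacement_bounds(value: int):
    """Lower/upper bounds for replacing a cardinal value: zero up to the
    next power of 10 above the original (e.g. 54 -> (0, 100))."""
    upper = 10 ** len(str(abs(int(value))))
    return 0, upper


def perturb(value: int, rng=random) -> int:
    """Draw a random replacement within the bounds for `value`."""
    low, high = replacement_bounds(value)
    return rng.randint(low, high)


print(replacement_bounds(54))  # (0, 100)
print(replacement_bounds(7))   # (0, 10)
```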
For both languages, we identified different subsets of the test set that can be compared to each other to better understand the results. There are currently 8 selections that we have made:
Selection 1 (size): input length. This selection corresponds to the number of predicates in the input. By comparing inputs of different lengths, we can see to what extent NLG systems are able to handle different input sizes. The table below provides the relevant frequencies. Please be aware that comparisons between selections with fewer than 100 items may be unreliable.
| Input length | Frequency English | Frequency Russian |
|----------------|-------------------|-------------------|
| 1 | 369 | 254 |
| 2 | 349 | 200 |
| 3 | 350 | 214 |
| 4 | 305 | 214 |
| 5 | 213 | 159 |
| 6 | 114 | 32 |
| 7 | 79 | 29 |
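The per-length frequencies above sum exactly to the test-set sizes reported in the split table earlier in this card (1,779 English and 1,102 Russian triple sets), which is a quick sanity check on the selection:

```python
# Frequencies by input length (1 to 7 triples), from the table above
english = [369, 349, 350, 305, 213, 114, 79]
russian = [254, 200, 214, 214, 159, 32, 29]

print(sum(english))  # 1779 — English test triple sets
print(sum(russian))  # 1102 — Russian test triple sets
```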
Selection 2 (frequency): seen/unseen single predicates. This selection corresponds to the inputs with only one predicate. We compare which predicates are seen/unseen in the training data. The table below provides the relevant frequencies. Note that the comparison is only valid for English, not for Russian, since there is only one example of unseen single predicates.
| _ in training | Frequency English | Frequency Russian |
|---------------|-------------------|-------------------|
| Seen | 297 | 253 |
| Unseen | 72 | 1 |
Selection 3 (frequency): seen/unseen combinations of predicates. This selection checks for all combinations of predicates whether that combination has been seen in the training data. For example: if the combination of predicates A and B is seen, that means that there is an input in the training data consisting of two triples, where one triple uses predicate A and the other uses predicate B. If the combination is unseen, then the converse is true. The table below provides the relevant frequencies.
| _ in training | Frequency English | Frequency Russian |
|---------------|-------------------|-------------------|
| unseen | 1295 | 354 |
| seen | 115 | 494 |
Selection 4 (frequency): seen/unseen arguments. This selection checks for each input whether all arg1s and arg2s in the input have been seen during the training phase. For this selection, *Seen* is the default: only if all arg1 instances for a particular input are unseen do we count the arg1s of the input as unseen, and the same holds for arg2. So "seen" here really means that at least some of the arg1s or arg2s in the input are seen. The table below provides the relevant frequencies. Note that the comparison is only valid for English, not for Russian, since there are very few examples of unseen arguments.
| Arguments seen in training? | Frequency English | Frequency Russian |
|-----------------------------|-------------------|-------------------|
| both_seen | 518 | 1075 |
| both_unseen | 1177 | 4 |
| arg1_unseen | 56 | 19 |
| arg2_unseen | 28 | 4 |
Selection 5 (shape): repeated subjects. For this selection, the subsets are based on the number of times a subject is repeated in the input; it only takes into account the maximum number of repetitions, that is, if in one input a subject appears 3 times and a different subject 2 times, this input will be in the `3_subjects_same` split. `unique_subjects` means all subjects are different.
| Max num. of repeated subjects | Frequency English | Frequency Russian |
|-------------------------------|-------------------|-------------------|
| unique_subjects | 453 | 339 |
| 2_subjects_same | 414 | 316 |
| 3_subjects_same | 382 | 217 |
| 4_subjects_same | 251 | 143 |
| 5_subjects_same | 158 | 56 |
| 6_subjects_same | 80 | 19 |
| 7_subjects_same | 41 | 12 |
Selection 6 (shape): repeated objects. Same as for subjects above, but for objects. There are far fewer cases of repeated objects, so there are only two categories for this selection, `unique_objects` and `some_objects_same`; for the latter, we have up to 3 coreferring objects in English, and XXX in Russian.
| Max num. of repeated objects | Frequency English | Frequency Russian |
|------------------------------|-------------------|-------------------|
| unique_objects | 1654 | 1099 |
| some_objects_same | 125 | 3 |
Selection 7 (shape): repeated properties. Same as for objects above, but for properties; up to two properties can be the same in English, up to XXX in Russian.
| Max num. of repeated properties | Frequency English | Frequency Russian |
|---------------------------------|-------------------|-------------------|
| unique_properties | 1510 | 986 |
| some_properties_same | 269 | 116 |
Selection 8 (shape): entities that appear both as subject and object. For this selection, we grouped together the inputs in which no entity appears as both subject and object, and on the other side the inputs in which one or more entities appear both as subject and as object. We found up to two such entities per input in English, and up to XXX in Russian.
| Max num. of objects and subjects in common | Frequency English | Frequency Russian |
|--------------------------------------------|-------------------|-------------------|
| unique_properties | 1322 | 642 |
| some_properties_same | 457 | 460 |
#### Split Motivation
<!-- info: What aspects of the model's generation capacities were the splits created to test? -->
<!-- scope: periscope -->
Robustness
### Getting Started with the Task
#### Pointers to Resources
<!-- info: Getting started with in-depth research on the task. Add relevant pointers to resources that researchers can consult when they want to get started digging deeper into the task. -->
<!-- scope: microscope -->
Dataset construction: [main dataset paper](https://www.aclweb.org/anthology/P17-1017/), [RDF triple extraction](https://www.aclweb.org/anthology/C16-1141/), [Russian translation](https://www.aclweb.org/anthology/W19-3706/)
WebNLG Challenge 2017: [webpage](https://webnlg-challenge.loria.fr/challenge_2017/), [paper](https://www.aclweb.org/anthology/W17-3518/)
WebNLG Challenge 2020: [webpage](https://webnlg-challenge.loria.fr/challenge_2020/), [paper](https://webnlg-challenge.loria.fr/files/2020.webnlg-papers.7.pdf)
Enriched version of WebNLG: [repository](https://github.com/ThiagoCF05/webnlg), [paper](https://www.aclweb.org/anthology/W18-6521/)
Related research papers: [webpage](https://webnlg-challenge.loria.fr/research/)
## Previous Results
### Previous Results
#### Proposed Evaluation
<!-- info: List and describe the purpose of the metrics and evaluation methodology (including human evaluation) that the dataset creators used when introducing this task. -->
<!-- scope: microscope -->
For both languages, the participating systems are automatically evaluated in a multi-reference scenario. Each English hypothesis is compared to a maximum of 5 references, and each Russian one to a maximum of 7 references. On average, English data has 2.89 references per test instance, and Russian data has 2.52 references per instance.
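The per-instance reference averages follow directly from the test-split sizes reported in the Data Splits section (texts divided by triple sets in each test set):

```python
# English test set: 5,150 texts over 1,779 triple sets
# Russian test set: 2,780 texts over 1,102 triple sets
en_avg = 5150 / 1779
ru_avg = 2780 / 1102

print(round(en_avg, 2))  # 2.89 references per English test instance
print(round(ru_avg, 2))  # 2.52 references per Russian test instance
```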
In a human evaluation, examples are uniformly sampled across triple-set sizes and the following dimensions are assessed (on MTurk and Yandex.Toloka):
1. Data Coverage: Does the text include descriptions of all predicates presented in the data?
2. Relevance: Does the text describe only such predicates (with related subjects and objects), which are found in the data?
3. Correctness: When describing predicates which are found in the data, does the text mention the correct objects and adequately introduce the subject for this specific predicate?
4. Text Structure: Is the text grammatical, well-structured, and written in acceptable English?
5. Fluency: Does the text progress naturally, form a coherent whole, and is it easy to understand?
For additional information like the instructions, we refer to the original paper.
#### Previous results available?
<!-- info: Are previous results available? -->
<!-- scope: telescope -->
yes
#### Other Evaluation Approaches
<!-- info: What evaluation approaches have others used? -->
<!-- scope: periscope -->
We evaluated a wide range of models as part of the GEM benchmark.
#### Relevant Previous Results
<!-- info: What are the most relevant previous results for this task/dataset? -->
<!-- scope: microscope -->
Results can be found on the [GEM website](https://gem-benchmark.com/results).
## Broader Social Context
### Previous Work on the Social Impact of the Dataset
#### Usage of Models based on the Data
<!-- info: Are you aware of cases where models trained on the task featured in this dataset or related tasks have been used in automated systems? -->
<!-- scope: telescope -->
yes - related tasks
#### Social Impact Observations
<!-- info: Did any of these previous uses result in observations about the social impact of the systems? In particular, has there been work outlining the risks and limitations of the system? Provide links and descriptions here. -->
<!-- scope: microscope -->
We do not foresee any negative social impact in particular from this dataset or task.
Positive outlooks: Being able to generate good quality text from RDF data would permit, e.g., making this data more accessible to lay users, enriching existing text with information drawn from knowledge bases such as DBpedia or describing, comparing and relating entities present in these knowledge bases.
### Impact on Under-Served Communities
#### Addresses needs of underserved Communities?
<!-- info: Does this dataset address the needs of communities that are traditionally underserved in language technology, and particularly language generation technology? Communities may be underserved for example because their language, language variety, or social or geographical context is underrepresented in NLP and NLG resources (datasets and models). -->
<!-- scope: telescope -->
no
### Discussion of Biases
#### Any Documented Social Biases?
<!-- info: Are there documented social biases in the dataset? Biases in this context are variations in the ways members of different social categories are represented that can have harmful downstream consequences for members of the more disadvantaged group. -->
<!-- scope: telescope -->
yes
#### Links and Summaries of Analysis Work
<!-- info: Provide links to and summaries of works analyzing these biases. -->
<!-- scope: microscope -->
This dataset is created using DBpedia RDF triples which naturally exhibit biases that have been found to exist in Wikipedia such as some forms of, e.g., gender bias.
The choice of [entities](https://gitlab.com/shimorina/webnlg-dataset/-/blob/master/supplementary/entities_dict.json), described by RDF trees, was not controlled. As such, they may contain gender biases; for instance, all the astronauts described by RDF triples are male. Hence, in texts, pronouns _he/him/his_ occur more often. Similarly, entities can be related to the Western culture more often than to other cultures.
#### Are the Language Producers Representative of the Language?
<!-- info: Does the distribution of language producers in the dataset accurately represent the full distribution of speakers of the language world-wide? If not, how does it differ? -->
<!-- scope: periscope -->
In English, the dataset is limited to the language that crowdraters speak. In Russian, the language is heavily biased by the translationese of the translation system that is post-edited.
## Considerations for Using the Data
### PII Risks and Liability
#### Potential PII Risk
<!-- info: Considering your answers to the PII part of the Data Curation Section, describe any potential privacy risks to the data subjects and creators when using the dataset. -->
<!-- scope: microscope -->
There is no PII in this dataset.
### Licenses
#### Copyright Restrictions on the Dataset
<!-- info: Based on your answers in the Intended Use part of the Data Overview Section, which of the following best describe the copyright and licensing status of the dataset? -->
<!-- scope: periscope -->
`non-commercial use only`
#### Copyright Restrictions on the Language Data
<!-- info: Based on your answers in the Language part of the Data Curation Section, which of the following best describe the copyright and licensing status of the underlying language data? -->
<!-- scope: periscope -->
`public domain`
### Known Technical Limitations
#### Technical Limitations
<!-- info: Describe any known technical limitations, such as spurious correlations, train/test overlap, annotation biases, or mis-annotations, and cite the works that first identified these limitations when possible. -->
<!-- scope: microscope -->
The quality of the crowdsourced references is limited, in particular in terms of fluency/naturalness of the collected texts.
Russian data was machine-translated and then post-edited by crowdworkers, so some examples may still exhibit issues related to bad translations.
#### Unsuited Applications
<!-- info: When using a model trained on this dataset in a setting where users or the public may interact with its predictions, what are some pitfalls to look out for? In particular, describe some applications of the general task featured in this dataset that its curation or properties make it less suitable for. -->
<!-- scope: microscope -->
Only a limited number of domains are covered in this dataset. As a result, it cannot be used as a general-purpose realizer.
|
YuxuanZhang888/ColonCancerCTDataset | ---
annotations_creators:
- no-annotation
language_creators:
- other
language:
- en
license:
- other
multilinguality:
- monolingual
size_categories:
- 100B<n<1T
source_datasets:
- original
task_categories:
- image-classification
task_ids:
- multi-label-image-classification
pretty_name: ColonCancerCTDataset
tags:
- colon cancer
- medical
- cancer
dataset_info:
features:
- name: image
dtype: image
- name: ImageType
sequence: string
- name: StudyDate
dtype: string
- name: SeriesDate
dtype: string
- name: Manufacturer
dtype: string
- name: StudyDescription
dtype: string
- name: SeriesDescription
dtype: string
- name: PatientSex
dtype: string
- name: PatientAge
dtype: string
- name: PregnancyStatus
dtype: string
- name: BodyPartExamined
dtype: string
splits:
- name: train
num_bytes: 3537157.0
num_examples: 30
download_size: 3538117
dataset_size: 3537157.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card Creation Guide
## Table of Contents
- [Dataset Card Creation Guide](#dataset-card-creation-guide)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://portal.imaging.datacommons.cancer.gov](https://portal.imaging.datacommons.cancer.gov)
- **Repository:** [https://aws.amazon.com/marketplace/pp/prodview-3bcx7vcebfi2i#resources](https://aws.amazon.com/marketplace/pp/prodview-3bcx7vcebfi2i#resources)
- **Paper:** [https://aacrjournals.org/cancerres/article/81/16/4188/670283/NCI-Imaging-Data-CommonsNCI-Imaging-Data-Commons](https://aacrjournals.org/cancerres/article/81/16/4188/670283/NCI-Imaging-Data-CommonsNCI-Imaging-Data-Commons)
### Dataset Summary
This dataset is a curated subset of the National Cancer Institute Imaging Data Commons (IDC), specifically highlighting CT Colonography images. It comprises a targeted collection from the broader IDC repository hosted on the AWS Marketplace, which includes diverse cancer imaging data. The images are sourced from clinical studies worldwide; the wider IDC repository spans modalities such as Computed Tomography (CT), Magnetic Resonance Imaging (MRI), and Positron Emission Tomography (PET).
In addition to the clinical images, essential metadata covering patient demographics (sex, age, and pregnancy status) and detailed study descriptions is also included, enabling nuanced analysis and interpretation of the imaging data.
### Supported Tasks
The dataset can be utilized for several tasks:
- Developing machine learning models to differentiate between benign and malignant colonic lesions.
- Developing precise algorithms for segmenting polyps and other colonic structures.
- Conducting longitudinal studies on cancer progression.
- Assessing the diagnostic accuracy of CT Colonography compared to other imaging modalities in colorectal conditions.
### Languages
English is used for text data like labels and imaging study descriptions.
## Dataset Structure
### Data Instances
The data follows the structure below:
```
{
    "image": image.png,  # The CT image
    "ImageType": ["ORIGINAL", "PRIMARY", "AXIAL", "CT_SOM5 SPI"],  # Image type information
    "StudyDate": "20000101",  # Date of the study
    "SeriesDate": "20000101",  # Date of the series
    "Manufacturer": "SIEMENS",  # Manufacturer of the imaging device
    "StudyDescription": "Abdomen^24ACRIN_Colo_IRB2415-04 (Adult)",  # Description of the study
    "SeriesDescription": "Colo_prone 1.0 B30f",  # Description of the series
    "PatientSex": "F",  # Patient's sex
    "PatientAge": "059Y",  # Patient's age
    "PregnancyStatus": "None",  # Patient's pregnancy status
    "BodyPartExamined": "COLON"  # Body part examined
}
```
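As a concrete illustration, the metadata fields of a record can be handled as plain Python strings. The record below is a hypothetical sketch mirroring the sample values above; in the published dataset the `image` field holds a decoded PIL image rather than a file name, so it is omitted here:

```python
# Hypothetical record mirroring the sample above; the "image" field is
# omitted because in the published dataset it is a decoded PIL image.
record = {
    "ImageType": ["ORIGINAL", "PRIMARY", "AXIAL", "CT_SOM5 SPI"],
    "StudyDate": "20000101",
    "SeriesDate": "20000101",
    "Manufacturer": "SIEMENS",
    "StudyDescription": "Abdomen^24ACRIN_Colo_IRB2415-04 (Adult)",
    "SeriesDescription": "Colo_prone 1.0 B30f",
    "PatientSex": "F",
    "PatientAge": "059Y",
    "PregnancyStatus": "None",
    "BodyPartExamined": "COLON",
}

# Example of metadata filtering: selecting prone-position series by
# inspecting the free-text series description.
is_prone_series = "prone" in record["SeriesDescription"].lower()
```

The same pattern applies to any of the string-valued metadata fields listed in the next section.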
### Data Fields
- image (PIL.PngImagePlugin.PngImageFile): The CT image in PNG format
- ImageType (List(String)): A list containing the info of the image
- StudyDate (String): Date of the case study
- SeriesDate (String): Date of the series study
- Manufacturer (String): Manufacturer of the device used for imaging
- StudyDescription (String): Description of the study
- SeriesDescription (String): Description of the series
- PatientSex (String): Patient's sex
- PatientAge (String): Patient's age
- PregnancyStatus (String): Patient's pregnancy status
- BodyPartExamined (String): The body part examined
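Several fields use DICOM-style string encodings, e.g. `PatientAge` as `"059Y"` and dates as `"20000101"`. Below is a minimal sketch of helpers for decoding them; the function names are our own illustration, not part of the dataset or any library:

```python
from datetime import date, datetime

def parse_patient_age(age_str: str) -> int:
    """Convert a DICOM AS-style age string such as '059Y' to whole years.

    The suffix is one of D (days), W (weeks), M (months), Y (years);
    non-year units are converted to approximate whole years.
    """
    value, unit = int(age_str[:3]), age_str[3]
    units_per_year = {"D": 365, "W": 52, "M": 12, "Y": 1}[unit]
    return value // units_per_year

def parse_dicom_date(date_str: str) -> date:
    """Convert a DICOM DA-style date string such as '20000101' to a date."""
    return datetime.strptime(date_str, "%Y%m%d").date()
```

For example, `parse_patient_age("059Y")` yields `59`, and `parse_dicom_date("20000101")` yields `date(2000, 1, 1)`.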
### Data Splits
The dataset is currently released as a single split:

|                    | train |
|--------------------|------:|
| Number of examples |    30 |
## Dataset Creation
### Curation Rationale
The dataset is conceived from the necessity to streamline a vast collection of heterogeneous cancer imaging data to facilitate focused research on colon cancer. By distilling the dataset to specifically include CT Colonography, it addresses the challenge of data accessibility for researchers and healthcare professionals interested in colon cancer. This refinement simplifies the task of obtaining relevant data for developing diagnostic models and potentially improving patient outcomes through early detection. The curation of this focused dataset aims to make data more open and usable for specialists and academics in the field of colon cancer research.
### Source Data
According to [IDC](https://portal.imaging.datacommons.cancer.gov/about/), data are submitted from NCI-funded driving projects and other special selected projects.
### Personal and Sensitive Information
According to [IDC](https://portal.imaging.datacommons.cancer.gov/about/), submitters of data to IDC must ensure that the data have been de-identified for protected health information (PHI).
## Considerations for Using the Data
### Social Impact of Dataset
The dataset tailored for CT Colonography aims to enhance medical research and potentially aid in early detection and treatment of colon cancer. Providing high-quality imaging data empowers the development of diagnostic AI tools, contributing to improved patient care and outcomes. This can have a profound social impact, as timely diagnosis is crucial in treating cancer effectively.
### Discussion of Biases
Given the dataset's focus on CT Colonography, biases may arise from the population demographics represented or the prevalence of certain conditions within the dataset. It is crucial to ensure that the dataset includes diverse cases to mitigate biases in model development and to ensure that AI tools developed using this data are generalizable and equitable in their application.
### Other Known Limitations
The dataset may have limitations in terms of variability and scope, as it focuses solely on CT Colonography. Other modalities and cancer types are not represented, which could limit the breadth of research.
### Licensing Information
https://fairsharing.org/FAIRsharing.0b5a1d
### Citation Information
The dataset can be cited with the following [BibTeX](http://www.bibtex.org/)-formatted reference:
```
@article{fedorov2021nci,
title={NCI imaging data commons},
author={Fedorov, Andrey and Longabaugh, William JR and Pot, David
and Clunie, David A and Pieper, Steve and Aerts, Hugo JWL and
Homeyer, Andr{\'e} and Lewis, Rob and Akbarzadeh, Afshin and
Bontempi, Dennis and others},
journal={Cancer research},
volume={81},
number={16},
pages={4188--4193},
year={2021},
publisher={AACR}
}
```
[DOI](https://doi.org/10.1158/0008-5472.CAN-21-0950) |
dmlea/github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: labels
list:
- name: id
dtype: int64
- name: node_id
dtype: string
- name: url
dtype: string
- name: name
dtype: string
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: assignees
list:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: milestone
dtype: 'null'
- name: comments
dtype: int64
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: closed_at
dtype: timestamp[s]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: body
dtype: string
- name: reactions
struct:
- name: url
dtype: string
- name: total_count
dtype: int64
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: laugh
dtype: int64
- name: hooray
dtype: int64
- name: confused
dtype: int64
- name: heart
dtype: int64
- name: rocket
dtype: int64
- name: eyes
dtype: int64
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
- name: draft
dtype: bool
- name: pull_request
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: diff_url
dtype: string
- name: patch_url
dtype: string
- name: merged_at
dtype: timestamp[s]
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 1660862
num_examples: 500
download_size: 437911
dataset_size: 1660862
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "github-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
manu/gallica_ocr_cleaned | ---
dataset_info:
features:
- name: file
dtype: string
- name: clean_text
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 44570227
num_examples: 7687
download_size: 25073743
dataset_size: 44570227
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MhonkYI/TheRealWorldstudent | ---
license: unknown
---
|
open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v4.1 | ---
pretty_name: Evaluation run of bardsai/jaskier-7b-dpo-v4.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bardsai/jaskier-7b-dpo-v4.1](https://huggingface.co/bardsai/jaskier-7b-dpo-v4.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v4.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-14T11:50:03.919128](https://huggingface.co/datasets/open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v4.1/blob/main/results_2024-02-14T11-50-03.919128.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6535379230152994,\n\
\ \"acc_stderr\": 0.03198200076203346,\n \"acc_norm\": 0.6529860127972825,\n\
\ \"acc_norm_stderr\": 0.032649840332740133,\n \"mc1\": 0.6132190942472461,\n\
\ \"mc1_stderr\": 0.017048857010515103,\n \"mc2\": 0.7592071044175611,\n\
\ \"mc2_stderr\": 0.01411814026868143\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.712457337883959,\n \"acc_stderr\": 0.013226719056266129,\n\
\ \"acc_norm\": 0.7295221843003413,\n \"acc_norm_stderr\": 0.012980954547659556\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7173869747062338,\n\
\ \"acc_stderr\": 0.0044934958720001085,\n \"acc_norm\": 0.8906592312288388,\n\
\ \"acc_norm_stderr\": 0.0031142850772280296\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473082,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473082\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886797,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886797\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.0154808268653743,\n \"acc_norm\"\
: 0.8458715596330275,\n \"acc_norm_stderr\": 0.0154808268653743\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n\
\ \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n\
\ \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n\
\ \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903347,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903347\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45139664804469276,\n\
\ \"acc_stderr\": 0.016643307372315872,\n \"acc_norm\": 0.45139664804469276,\n\
\ \"acc_norm_stderr\": 0.016643307372315872\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n\
\ \"acc_stderr\": 0.01275285834653313,\n \"acc_norm\": 0.47392438070404175,\n\
\ \"acc_norm_stderr\": 0.01275285834653313\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6132190942472461,\n\
\ \"mc1_stderr\": 0.017048857010515103,\n \"mc2\": 0.7592071044175611,\n\
\ \"mc2_stderr\": 0.01411814026868143\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8468823993685872,\n \"acc_stderr\": 0.010120623252272955\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6830932524639879,\n \
\ \"acc_stderr\": 0.012815868296721362\n }\n}\n```"
repo_url: https://huggingface.co/bardsai/jaskier-7b-dpo-v4.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|arc:challenge|25_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|gsm8k|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hellaswag|10_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T11-50-03.919128.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T11-50-03.919128.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- '**/details_harness|winogrande|5_2024-02-14T11-50-03.919128.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-14T11-50-03.919128.parquet'
- config_name: results
data_files:
- split: 2024_02_14T11_50_03.919128
path:
- results_2024-02-14T11-50-03.919128.parquet
- split: latest
path:
- results_2024-02-14T11-50-03.919128.parquet
---
# Dataset Card for Evaluation run of bardsai/jaskier-7b-dpo-v4.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [bardsai/jaskier-7b-dpo-v4.1](https://huggingface.co/bardsai/jaskier-7b-dpo-v4.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v4.1",
"harness_winogrande_5",
split="train")
```
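Each MMLU subject listed in the configurations above follows the same mechanical naming pattern, so a small helper (hypothetical, not part of the dataset tooling) can build the config name for any subject before passing it to `load_dataset`:

```python
def mmlu_config_name(subject: str, n_shot: int = 5) -> str:
    """Build the config name for a hendrycksTest (MMLU) subject,
    following the naming scheme used by this dataset's configurations."""
    return f"harness_hendrycksTest_{subject}_{n_shot}"

# e.g. mmlu_config_name("abstract_algebra")
# -> "harness_hendrycksTest_abstract_algebra_5"
```

The returned string can be used as the second argument to `load_dataset` in the snippet above.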
## Latest results
These are the [latest results from run 2024-02-14T11:50:03.919128](https://huggingface.co/datasets/open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v4.1/blob/main/results_2024-02-14T11-50-03.919128.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in the "latest" split of its configuration):
```python
{
"all": {
"acc": 0.6535379230152994,
"acc_stderr": 0.03198200076203346,
"acc_norm": 0.6529860127972825,
"acc_norm_stderr": 0.032649840332740133,
"mc1": 0.6132190942472461,
"mc1_stderr": 0.017048857010515103,
"mc2": 0.7592071044175611,
"mc2_stderr": 0.01411814026868143
},
"harness|arc:challenge|25": {
"acc": 0.712457337883959,
"acc_stderr": 0.013226719056266129,
"acc_norm": 0.7295221843003413,
"acc_norm_stderr": 0.012980954547659556
},
"harness|hellaswag|10": {
"acc": 0.7173869747062338,
"acc_stderr": 0.0044934958720001085,
"acc_norm": 0.8906592312288388,
"acc_norm_stderr": 0.0031142850772280296
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473082,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473082
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886797,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886797
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.0154808268653743,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.0154808268653743
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903347,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903347
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45139664804469276,
"acc_stderr": 0.016643307372315872,
"acc_norm": 0.45139664804469276,
"acc_norm_stderr": 0.016643307372315872
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47392438070404175,
"acc_stderr": 0.01275285834653313,
"acc_norm": 0.47392438070404175,
"acc_norm_stderr": 0.01275285834653313
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6132190942472461,
"mc1_stderr": 0.017048857010515103,
"mc2": 0.7592071044175611,
"mc2_stderr": 0.01411814026868143
},
"harness|winogrande|5": {
"acc": 0.8468823993685872,
"acc_stderr": 0.010120623252272955
},
"harness|gsm8k|5": {
"acc": 0.6830932524639879,
"acc_stderr": 0.012815868296721362
}
}
```
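For quick sanity checks, the per-subject scores in the JSON above can be aggregated locally. A minimal sketch, using a hypothetical excerpt of the `harness|hendrycksTest-*` accuracies reported above (the full card lists all 57 subjects):

```python
# Macro-average a subset of the per-subject "hendrycksTest" accuracies
# from the results block above. The dict is a hypothetical excerpt for
# illustration; values are copied from the card.
scores = {
    "college_biology": 0.7638888888888888,
    "college_chemistry": 0.48,
    "college_computer_science": 0.58,
    "college_mathematics": 0.31,
}

# Unweighted mean over subjects (each subject counts equally,
# regardless of how many questions it contains).
macro_avg = sum(scores.values()) / len(scores)
print(f"macro-average acc over {len(scores)} subjects: {macro_avg:.4f}")
```

The same pattern extends to the full results file: load the JSON, filter keys starting with `harness|hendrycksTest-`, and average their `acc` fields.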
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
open-llm-leaderboard/details_Jingyu6__MergeTest-7B-slerp
---
pretty_name: Evaluation run of Jingyu6/MergeTest-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Jingyu6/MergeTest-7B-slerp](https://huggingface.co/Jingyu6/MergeTest-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Jingyu6__MergeTest-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-13T22:27:10.970794](https://huggingface.co/datasets/open-llm-leaderboard/details_Jingyu6__MergeTest-7B-slerp/blob/main/results_2024-01-13T22-27-10.970794.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6435586511149554,\n\
\ \"acc_stderr\": 0.03211164826791609,\n \"acc_norm\": 0.6437934877212986,\n\
\ \"acc_norm_stderr\": 0.03276783411526557,\n \"mc1\": 0.42962056303549573,\n\
\ \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.5979568100280714,\n\
\ \"mc2_stderr\": 0.015157800976988994\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6484641638225256,\n \"acc_stderr\": 0.013952413699600938,\n\
\ \"acc_norm\": 0.6774744027303754,\n \"acc_norm_stderr\": 0.013659980894277366\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6698864767974507,\n\
\ \"acc_stderr\": 0.004692926794268468,\n \"acc_norm\": 0.8614817765385382,\n\
\ \"acc_norm_stderr\": 0.003447370972192066\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778408,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778408\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356853,\n \"\
acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356853\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"\
acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603346,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603346\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n\
\ \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473082,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473082\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977927,\n\
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977927\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597542,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597542\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n\
\ \"acc_stderr\": 0.013265346261323797,\n \"acc_norm\": 0.8352490421455939,\n\
\ \"acc_norm_stderr\": 0.013265346261323797\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36201117318435755,\n\
\ \"acc_stderr\": 0.016073067350153087,\n \"acc_norm\": 0.36201117318435755,\n\
\ \"acc_norm_stderr\": 0.016073067350153087\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.02505850331695814,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.02505850331695814\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n\
\ \"acc_stderr\": 0.012749206007657473,\n \"acc_norm\": 0.47131681877444587,\n\
\ \"acc_norm_stderr\": 0.012749206007657473\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399677,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42962056303549573,\n\
\ \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.5979568100280714,\n\
\ \"mc2_stderr\": 0.015157800976988994\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7963693764798737,\n \"acc_stderr\": 0.011317798781626918\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6974981046247157,\n \
\ \"acc_stderr\": 0.012652544133186141\n }\n}\n```"
repo_url: https://huggingface.co/Jingyu6/MergeTest-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|arc:challenge|25_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|gsm8k|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hellaswag|10_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T22-27-10.970794.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T22-27-10.970794.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- '**/details_harness|winogrande|5_2024-01-13T22-27-10.970794.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-13T22-27-10.970794.parquet'
- config_name: results
data_files:
- split: 2024_01_13T22_27_10.970794
path:
- results_2024-01-13T22-27-10.970794.parquet
- split: latest
path:
- results_2024-01-13T22-27-10.970794.parquet
---
# Dataset Card for Evaluation run of Jingyu6/MergeTest-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Jingyu6/MergeTest-7B-slerp](https://huggingface.co/Jingyu6/MergeTest-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Jingyu6__MergeTest-7B-slerp",
"harness_winogrande_5",
split="train")
```
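Each per-task configuration name is derived from the harness task identifier that appears in the parquet filenames above (for example, `harness|hendrycksTest-abstract_algebra|5` corresponds to the config `harness_hendrycksTest_abstract_algebra_5`). As a minimal sketch, the mapping can be reproduced with a small helper; note this function is illustrative only, inferred from the naming pattern in this card, and is not part of the `datasets` API:

```python
def task_to_config_name(task_id: str) -> str:
    """Map a harness task identifier to its dataset config name.

    Illustrative helper inferred from this card's naming convention:
    '|', '-', and ':' in the task identifier become '_' in the config name.
    """
    return task_id.replace("|", "_").replace("-", "_").replace(":", "_")

# e.g. task_to_config_name("harness|winogrande|5") yields "harness_winogrande_5",
# which can then be passed as the second argument to load_dataset().
```

The resulting config name can be used in place of `"harness_winogrande_5"` in the `load_dataset` call above, with `split="latest"` to always get the most recent run.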
## Latest results
These are the [latest results from run 2024-01-13T22:27:10.970794](https://huggingface.co/datasets/open-llm-leaderboard/details_Jingyu6__MergeTest-7B-slerp/blob/main/results_2024-01-13T22-27-10.970794.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6435586511149554,
"acc_stderr": 0.03211164826791609,
"acc_norm": 0.6437934877212986,
"acc_norm_stderr": 0.03276783411526557,
"mc1": 0.42962056303549573,
"mc1_stderr": 0.017329234580409098,
"mc2": 0.5979568100280714,
"mc2_stderr": 0.015157800976988994
},
"harness|arc:challenge|25": {
"acc": 0.6484641638225256,
"acc_stderr": 0.013952413699600938,
"acc_norm": 0.6774744027303754,
"acc_norm_stderr": 0.013659980894277366
},
"harness|hellaswag|10": {
"acc": 0.6698864767974507,
"acc_stderr": 0.004692926794268468,
"acc_norm": 0.8614817765385382,
"acc_norm_stderr": 0.003447370972192066
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778408,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778408
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603346,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473082,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473082
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977927,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977927
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597542,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597542
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8352490421455939,
"acc_stderr": 0.013265346261323797,
"acc_norm": 0.8352490421455939,
"acc_norm_stderr": 0.013265346261323797
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36201117318435755,
"acc_stderr": 0.016073067350153087,
"acc_norm": 0.36201117318435755,
"acc_norm_stderr": 0.016073067350153087
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.02505850331695814,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.02505850331695814
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47131681877444587,
"acc_stderr": 0.012749206007657473,
"acc_norm": 0.47131681877444587,
"acc_norm_stderr": 0.012749206007657473
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42962056303549573,
"mc1_stderr": 0.017329234580409098,
"mc2": 0.5979568100280714,
"mc2_stderr": 0.015157800976988994
},
"harness|winogrande|5": {
"acc": 0.7963693764798737,
"acc_stderr": 0.011317798781626918
},
"harness|gsm8k|5": {
"acc": 0.6974981046247157,
"acc_stderr": 0.012652544133186141
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
distil-whisper/earnings21 | ---
dataset_info:
config_name: full
features:
- name: audio
dtype: audio
- name: file_id
dtype: string
- name: audio_length
dtype: string
- name: sample_rate
dtype: string
- name: company_name
dtype: string
- name: financial_quarter
dtype: string
- name: sector
dtype: string
- name: speaker_switches
dtype: string
- name: unique_speakers
dtype: string
- name: curator_id
dtype: string
- name: transcription
dtype: string
splits:
- name: test
num_bytes: 778199575.0
num_examples: 44
download_size: 772949298
dataset_size: 778199575.0
configs:
- config_name: full
data_files:
- split: test
path: full/test-*
---
# Dataset Card for "earnings21"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lorinma/IE_Sharegpt_zh | ---
language:
- zh
---
The information extraction data is sourced from CoNLL, ACE, casis, DuEE, People Daily, DuIE, and other corpora.
The dataset comes from [InstructIE: A Chinese Instruction-based Information Extraction Dataset](https://arxiv.org/abs/2305.11527). Thanks to Zhejiang University's [Cama group](https://github.com/zjunlp/KnowLM) for their high-quality work! (By the way, I still prefer your old name, Cama.)
Although every example is single-turn, converting the format to ShareGPT-style multi-turn conversations makes the data compatible with other multi-turn dialogue datasets. |
cahya/instructions-et | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 880362.0
num_examples: 1800
- name: test
num_bytes: 48909.0
num_examples: 100
- name: validation
num_bytes: 48909.0
num_examples: 100
download_size: 622841
dataset_size: 978180.0
---
# Dataset Card for "instructions-et"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pa-shk/tmdb_processed | ---
dataset_info:
features:
- name: name
dtype: string
- name: metadata
struct:
- name: adult
dtype: bool
- name: budget
dtype: int64
- name: genres
dtype: string
- name: imdb_id
dtype: string
- name: original_language
dtype: string
- name: original_title
dtype: string
- name: overview
dtype: string
- name: popularity
dtype: float64
- name: production_companies
dtype: string
- name: production_countries
dtype: string
- name: release_date
dtype: string
- name: revenue
dtype: int64
- name: runtime
dtype: int64
- name: spoken_languages
dtype: string
- name: status
dtype: string
- name: tagline
dtype: string
- name: vote_average
dtype: float64
- name: vote_count
dtype: int64
- name: id
dtype: string
- name: embed
sequence: float32
splits:
- name: train
num_bytes: 23949369
num_examples: 6629
download_size: 26213085
dataset_size: 23949369
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Leyhtalas/fabio | ---
license: openrail
---
|
yeeb/C50 | ---
license: openrail
---
## Dataset Description
The dataset is a subset of RCV1. This corpus has previously been used in author-identification experiments. The top 50 authors (with respect to total size of articles) were selected, restricted to authors of texts labeled with at least one subtopic of the class CCAT (corporate/industrial). This selection attempts to minimize the topic factor in distinguishing among the texts. The training corpus consists of 2,500 texts (50 per author) and the test corpus includes another 2,500 texts (50 per author), non-overlapping with the training texts.
- **Homepage:** https://archive.ics.uci.edu/ml/datasets/Reuter_50_50
- **Repository:** https://archive.ics.uci.edu/ml/datasets/Reuter_50_50
- **Paper:**
- **Leaderboard:**
- **Point of Contact:** |
jingzi/CIMD | ---
license: apache-2.0
task_categories:
- question-answering
- text-generation
language:
- zh
size_categories:
- 100K<n<1M
---
## Chinese Instruction Multimodal Data (CIMD)
The dataset contains one million Chinese image-text pairs in total, including detailed image captioning and visual question answering.
### Generation Pipeline
* Image source
We randomly sample images from two open-source datasets, [Wanjuan](https://github.com/opendatalab/WanJuan1.0) and [Wukong](https://wukong-dataset.github.io/wukong-dataset/).
* Detailed caption generation
We use [Gemini Pro Vision API](https://ai.google.dev/) to generate a detailed description for each image.
* Question-answer pairs generation
Based on the generated caption, we use the Gemini API to generate a complex question for the corresponding image. The detailed answer is then generated from the available information by calling the Gemini API again.
For multi-turn conversations, each follow-up question is generated conditioned on the conversation history.
* Post-processing
To further improve the quality of the generated instruction data, we discard image-text pairs that contain heavy repetition in the text.
We also found that some answers generated by Gemini may contain a degree of hallucination, so we filter out unreasonable image-text pairs by querying Gemini a second time.
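As an illustrative sketch only (the card does not publish the actual filtering code or thresholds), the repetition filter described above could look roughly like this, with hypothetical n-gram size and repeat-count cutoffs:

```python
from collections import Counter

def has_heavy_repetition(text: str, n: int = 8, max_repeats: int = 3) -> bool:
    """Return True if any character n-gram occurs more than max_repeats times.

    The n-gram length and threshold are assumptions for illustration;
    the dataset card only states that heavily repetitive texts are discarded.
    """
    grams = [text[i:i + n] for i in range(len(text) - n + 1)]
    if not grams:
        return False
    _, count = Counter(grams).most_common(1)[0]
    return count > max_repeats

# Keep only pairs whose answer text is not degenerate.
pairs = [
    {"answer": "图片中有一只熊猫。"},
    {"answer": "哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈"},
]
clean = [p for p in pairs if not has_heavy_repetition(p["answer"])]
```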
### Detailed information of the datasets
| Image Source | #Images | #Captions | #Single-turn QA Pairs | #Multi-turn QA pairs | #Total Image-text Pairs |
| --- | ---: | ---: | ---: | ---: | ---: |
[Wanjuan](https://github.com/opendatalab/WanJuan1.0) |212,326| 200,917| 539,371 |65,005| 805,293
[Wukong](https://wukong-dataset.github.io/wukong-dataset/)| 66,570 |106,368| 107,991 |0| 214,359
total| 278,896| 307,285| 647,362| 65,005| 1,019,652
All datasets can be downloaded [here](https://huggingface.co/datasets/jingzi/CIMD).
We also upload all the images as `images.zip`, since the original image files were renamed during downloading for convenience.
### Samples
* single-turn conversation with detailed caption (image from wanjuan dataset)
```
{
"id": "4fdb831b09ab87f3dfa758cbb3693eb2dec41b7a5cb0eeb55a6eb5faae2c7fb1_2",
"image": "wanjuan/4fdb831b09ab87f3dfa758cbb3693eb2dec41b7a5cb0eeb55a6eb5faae2c7fb1.jpg",
"conversations": [
{
"from": "human",
"value": " 这些屋顶的坡度各不相同,为什么有些屋顶的坡度较大,而有些屋顶的坡度较小呢?\n<image>"
},
{
"from": "gpt",
"value": " 屋顶的坡度与当地的气候条件有关。坡度大的屋顶可以更快地排水,更适合于多雨地区。坡度小的屋顶更适合于少雨地区。"
}
],
"detailed_caption": " 这是一组中国传统民居的屋顶图片。屋顶由青瓦铺成,屋脊上有各种装饰。屋顶的坡度很大,有利于排水。屋顶上还有许多小老虎,用以辟邪。"
}
```
* multi-turn conversation without detailed caption (image from wanjuan dataset)
```
{
"id": "787add42091e8f7f2ec23ab48fcbdf2201859fd7dd9c6ba95ae373aa11a7f9c8_0",
"image": "wanjuan/787add42091e8f7f2ec23ab48fcbdf2201859fd7dd9c6ba95ae373aa11a7f9c8.jpg",
"conversations": [
{
"from": "human",
"value": " 这张图片是一套咖啡店的VI设计,包括了名片、信纸、封套、宣传册。请问这套设计体现了咖啡店的哪些特色?\n<image>"
},
{
"from": "gpt",
"value": " 这套咖啡店的VI设计体现了咖啡店的以下特色:\n\n1. 现代感:这套设计使用了现代感十足的几何图形和线条,给人以时尚、摩登的感觉。\n2. 简约性:这套设计并没有采用过多的装饰元素,而是以简约的风格为主,给人以干净、清爽的感觉。\n3. 自然感:这套设计使用了大量的留白,并且在名片和信纸上都运用了大面积的绿色,给人以自然、清新之感。\n4. 趣味性:这套设计在一些细节之处加入了一些趣味性的元素,比如在名片的背面加入了一个小插图,在信纸的抬头处加入了一个咖啡杯的图案,这些小细节为这套设计增添了些许趣味。"
},
{
"from": "human",
"value": " 好的,我根据你的描述,给你设计了一个新的问题:\n\n这套咖啡店的VI设计使用了哪些颜色?这些颜色分别代表了什么?"
},
{
"from": "gpt", "value": " 这套咖啡店的VI设计使用了黄色、绿色和黑色。黄色代表着阳光、活力和热情;绿色代表着自然、清新和健康;黑色代表着沉稳、高贵和典雅。这三种颜色搭配在一起,给人以一种非常舒适的感觉,同时也很好地体现了咖啡店的特色。"
}
],
"detailed_caption": ""
}
```
* single-turn conversation with detailed caption (image from wukong dataset)
```
{
"id": "b97b0ec2d43f84d2c424689db9592d26_1",
"image": "wukong/b97b0ec2d43f84d2c424689db9592d26.jpg",
"conversations": [
{
"from": "human",
"value": " 为什么熊猫头上会有一个蝴蝶结?\n<image>"
},
{
"from": "gpt",
"value": " 图片是一张熊猫头的表情包,它头上有一个粉色的蝴蝶结。在现实生活中,熊猫头上一般不会有蝴蝶结。但在图片中,熊猫头上有一个蝴蝶结,这可能是为了装饰,或者是为了搞笑。"
}
],
"detailed_caption": " 这张图片是一张熊猫头的表情包,它头上有一个粉色的蝴蝶结,表情看起来很滑稽。图片的文字是“谢谢你啊,这么忙还来亲自骂我”,语气中带着一丝无奈和自嘲。"
}
```
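Assuming each record follows the schema shown in the samples above (`id`, `image`, `conversations` with alternating `human`/`gpt` turns, `detailed_caption`), a record can be flattened into prompt/response pairs with a short helper. The record below is a made-up stand-in, not an actual dataset entry:

```python
import json

sample = json.loads("""
{
  "id": "demo_0",
  "image": "wukong/demo.jpg",
  "conversations": [
    {"from": "human", "value": "What is in the image?\\n<image>"},
    {"from": "gpt", "value": "A panda with a pink bow."}
  ],
  "detailed_caption": ""
}
""")

def to_turns(record: dict) -> list[tuple[str, str]]:
    """Pair each human message with the gpt reply that follows it."""
    msgs = record["conversations"]
    return [
        (msgs[i]["value"], msgs[i + 1]["value"])
        for i in range(0, len(msgs) - 1, 2)
        if msgs[i]["from"] == "human" and msgs[i + 1]["from"] == "gpt"
    ]

turns = to_turns(sample)
```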
|
Vaibhav9401/toxic75k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: llama_finetune_text
dtype: string
splits:
- name: train
num_bytes: 61395720
num_examples: 72313
download_size: 11452836
dataset_size: 61395720
---
# Dataset Card for "toxic75k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pharaouk/stack-v2-python-with-content-chunk1 | ---
dataset_info:
features:
- name: repo_name
dtype: string
- name: repo_url
dtype: string
- name: snapshot_id
dtype: string
- name: revision_id
dtype: string
- name: directory_id
dtype: string
- name: branch_name
dtype: string
- name: visit_date
dtype: timestamp[ns]
- name: revision_date
dtype: timestamp[ns]
- name: committer_date
dtype: timestamp[ns]
- name: github_id
dtype: int64
- name: star_events_count
dtype: int64
- name: fork_events_count
dtype: int64
- name: gha_license_id
dtype: string
- name: gha_created_at
dtype: timestamp[ns]
- name: gha_updated_at
dtype: timestamp[ns]
- name: gha_pushed_at
dtype: timestamp[ns]
- name: gha_language
dtype: string
- name: files
list:
- name: alpha_fraction
dtype: float64
- name: alphanum_fraction
dtype: float64
- name: avg_line_length
dtype: float64
- name: blob_id
dtype: string
- name: content_id
dtype: string
- name: detected_licenses
sequence: string
- name: is_generated
dtype: bool
- name: is_vendor
dtype: bool
- name: language
dtype: string
- name: length_bytes
dtype: int64
- name: license_type
dtype: string
- name: max_line_length
dtype: int64
- name: num_lines
dtype: int64
- name: path
dtype: string
- name: repo_name
dtype: string
- name: src_encoding
dtype: string
- name: text
dtype: string
- name: num_files
dtype: int64
splits:
- name: train
num_bytes: 144356548766
num_examples: 2984968
download_size: 51814563482
dataset_size: 144356548766
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AhmadMustafa/Urdu-Instruct-News-Category-Classification | ---
language:
- ur
size_categories:
- 100K<n<1M
task_categories:
- text-classification
- text-generation
- text2text-generation
pretty_name: Instruct News Category Classification Urdu
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
- name: template_lang
sequence: string
splits:
- name: train
num_bytes: 261557910
num_examples: 100674
- name: test
num_bytes: 29048190
num_examples: 11187
download_size: 116629288
dataset_size: 290606100
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "Urdu-Instruct-News-Category-Classification"
This dataset is converted from the [original dataset](https://data.mendeley.com/datasets/834vsxnb99/3) by Khalid Hussain, Nimra Mughal, Irfan Ali, Saif Hassan, and Sher Muhammad Daudpota.
## Task:
Generate the News Paragraph, and classify the news category from it.
## Split Size:
- train: 100674
- test: 11187
## Prompt Template (In Urdu):
One of the following two templates is chosen at random (e.g. via `random.choice`). The first template has **template_id** 1 in the dataset; the second has **template_id** 2.
```
["""اس پیراگراف (خبروں) کو مندرجہ ذیل درجوں (کیٹگریز) میں سے ایک کیٹگری میں ڈالیں.
درجے:
- کھیل
- تفریح،
- کاروبار و معیشت،
- سائنس اور ٹیکنالوجی
پیراگراف: {}""",
"""دیے گی خبروں کو ایک لفظ یا درجہ (کیٹگری) میں بیان کریں.
درجے:
- کھیل
- تفریح،
- کاروبار و معیشت،
- سائنس اور ٹیکنالوجی
جملے: {}"""]
```
<b>Translation</b>:
```
[Classify the given paragraph (news article) into one of the following categories:
1. Sports
2. Entertainment
3. Business & Economics
4. Science & Technology
paragraph: {},
Classify the given news paragraph into one of the following categories:
1. Sports
2. Entertainment
3. Business & Economics
4. Science & Technology
sentences: {}]
```
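A minimal sketch of how an `inputs` string could be assembled from these templates. The helper and the English stand-in strings are hypothetical; the card only states that one of the two templates is chosen at random and that each takes the news paragraph via `{}`:

```python
import random

# Simplified English stand-ins for the two Urdu templates above;
# both take the news paragraph via str.format.
TEMPLATES = [
    "Classify the given paragraph into one category.\nparagraph: {}",
    "Classify the given news into one category.\nsentences: {}",
]

def build_input(paragraph: str, rng: random.Random) -> tuple[str, int]:
    """Pick a template at random and return (inputs, template_id)."""
    idx = rng.randrange(len(TEMPLATES))
    return TEMPLATES[idx].format(paragraph), idx + 1  # template_id is 1-based

rng = random.Random(0)
inputs, template_id = build_input("Some news text ...", rng)
```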
## Completion Template (In Urdu)
```
یہ دی گی خبریں {} سے تعلق رکھتی ہیں
```
<b>Translation</b>:
```
The given news belongs to the {category}
``` |
Praghxx/Bryann | ---
license: openrail
---
|
dtthanh/200_question_rag | ---
dataset_info:
features:
- name: question
dtype: string
- name: contexts
dtype: string
- name: ground_truths
sequence: string
splits:
- name: train
num_bytes: 222669
num_examples: 200
download_size: 42096
dataset_size: 222669
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|