datasetId | card |
|---|---|
krisha05/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 2309233
num_examples: 1000
download_size: 1371750
dataset_size: 2309233
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
shikii2/may | ---
license: openrail
---
|
open-llm-leaderboard/details_glaiveai__glaive-coder-7b | ---
pretty_name: Evaluation run of glaiveai/glaive-coder-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [glaiveai/glaive-coder-7b](https://huggingface.co/glaiveai/glaive-coder-7b) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_glaiveai__glaive-coder-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T06:05:57.317368](https://huggingface.co/datasets/open-llm-leaderboard/details_glaiveai__glaive-coder-7b/blob/main/results_2023-10-29T06-05-57.317368.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each one in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.00388003355704698,\n\
\ \"em_stderr\": 0.0006366682825519943,\n \"f1\": 0.055515939597315614,\n\
\ \"f1_stderr\": 0.0014057901382845646,\n \"acc\": 0.32489335335120895,\n\
\ \"acc_stderr\": 0.009957962270331142\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.00388003355704698,\n \"em_stderr\": 0.0006366682825519943,\n\
\ \"f1\": 0.055515939597315614,\n \"f1_stderr\": 0.0014057901382845646\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.052312357846853674,\n \
\ \"acc_stderr\": 0.006133057708959239\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5974743488555643,\n \"acc_stderr\": 0.013782866831703044\n\
\ }\n}\n```"
repo_url: https://huggingface.co/glaiveai/glaive-coder-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|arc:challenge|25_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T06_05_57.317368
path:
- '**/details_harness|drop|3_2023-10-29T06-05-57.317368.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T06-05-57.317368.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T06_05_57.317368
path:
- '**/details_harness|gsm8k|5_2023-10-29T06-05-57.317368.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T06-05-57.317368.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hellaswag|10_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T06_05_57.317368
path:
- '**/details_harness|winogrande|5_2023-10-29T06-05-57.317368.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T06-05-57.317368.parquet'
- config_name: results
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- results_2023-09-22T05-33-12.124557.parquet
- split: 2023_10_29T06_05_57.317368
path:
- results_2023-10-29T06-05-57.317368.parquet
- split: latest
path:
- results_2023-10-29T06-05-57.317368.parquet
---
# Dataset Card for Evaluation run of glaiveai/glaive-coder-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/glaiveai/glaive-coder-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [glaiveai/glaive-coder-7b](https://huggingface.co/glaiveai/glaive-coder-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_glaiveai__glaive-coder-7b",
"harness_winogrande_5",
split="train")
```
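The aggregated scores can be read the same way. As a minimal sketch, the "results" configuration and its "latest" split (both listed in the metadata above) can be loaded like this:
```python
from datasets import load_dataset

# Load the aggregated results; the "latest" split always points to the most
# recent run, while the timestamp-named splits hold earlier runs.
results = load_dataset(
    "open-llm-leaderboard/details_glaiveai__glaive-coder-7b",
    "results",
    split="latest",
)
print(results[0])
```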
## Latest results
These are the [latest results from run 2023-10-29T06:05:57.317368](https://huggingface.co/datasets/open-llm-leaderboard/details_glaiveai__glaive-coder-7b/blob/main/results_2023-10-29T06-05-57.317368.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.00388003355704698,
"em_stderr": 0.0006366682825519943,
"f1": 0.055515939597315614,
"f1_stderr": 0.0014057901382845646,
"acc": 0.32489335335120895,
"acc_stderr": 0.009957962270331142
},
"harness|drop|3": {
"em": 0.00388003355704698,
"em_stderr": 0.0006366682825519943,
"f1": 0.055515939597315614,
"f1_stderr": 0.0014057901382845646
},
"harness|gsm8k|5": {
"acc": 0.052312357846853674,
"acc_stderr": 0.006133057708959239
},
"harness|winogrande|5": {
"acc": 0.5974743488555643,
"acc_stderr": 0.013782866831703044
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Gabriel1322/teste11 | ---
license: openrail
---
|
chengyenhsieh/TAO-Amodal | ---
task_categories:
- object-detection
license: mit
tags:
- computer vision
- amodal-tracking
- object-tracking
- amodal-perception
configs:
- config_name: default
data_files:
- split: train
path: "amodal_annotations/train.json"
- split: validation
path: "amodal_annotations/validation.json"
- split: test
path: "amodal_annotations/test.json"
extra_gated_prompt: "To download the AVA and HACS videos you have to agree to terms and conditions."
extra_gated_fields:
You will use the Datasets only for non-commercial research and educational purposes.:
type: select
options:
- Yes
- No
You will NOT distribute the Datasets or any parts thereof.:
type: select
options:
- Yes
- No
Carnegie Mellon University makes no representations or warranties regarding the datasets, including but not limited to warranties of non-infringement or fitness for a particular purpose.:
type: select
options:
- Yes
- No
You accept full responsibility for your use of the datasets and shall defend and indemnify Carnegie Mellon University, including its employees, officers and agents, against any and all claims arising from your use of the datasets, including but not limited to your use of any copyrighted videos or images that you may create from the datasets.:
type: select
options:
- Yes
- No
You will treat people appearing in this data with respect and dignity.:
type: select
options:
- Yes
- No
This data comes with no warranty or guarantee of any kind, and you accept full liability.:
type: select
options:
- Yes
- No
extra_gated_heading: "TAO-Amodal VIDEO Request"
extra_gated_button_content: "Request Data"
---
# TAO-Amodal Dataset
<!-- Provide a quick summary of the dataset. -->
Official Source for Downloading the TAO-Amodal Dataset.
[**📙 Project Page**](https://tao-amodal.github.io/) | [**💻 Code**](https://github.com/WesleyHsieh0806/TAO-Amodal) | [**📎 Paper Link**](https://arxiv.org/abs/2312.12433) | [**✏️ Citations**](#citations)
<div align="center">
<a href="https://tao-amodal.github.io/"><img width="95%" alt="TAO-Amodal" src="https://tao-amodal.github.io/static/images/webpage_preview.png"></a>
</div>
<br/>
Contact: [🙋🏻♂️Cheng-Yen (Wesley) Hsieh](https://wesleyhsieh0806.github.io/)
## Dataset Description
Our dataset augments the TAO dataset with amodal bounding box annotations for fully invisible, out-of-frame, and occluded objects.
Note that this implies TAO-Amodal also includes modal segmentation masks (as visualized in the color overlays above).
Our dataset encompasses 880 categories, aimed at assessing the occlusion reasoning capabilities of current trackers
through the paradigm of Tracking Any Object with Amodal perception (TAO-Amodal).
### Dataset Download
1. Download with git:
```bash
git lfs install
git clone git@hf.co:datasets/chengyenhsieh/TAO-Amodal
```
- Alternatively, download with [`python`](https://huggingface.co/docs/huggingface_hub/guides/download#download-files-from-the-hub):
```python
from huggingface_hub import snapshot_download
snapshot_download(repo_id="chengyenhsieh/TAO-Amodal")
```
2. Unzip all videos:
Modify `dataset_root` in [unzip_video.py](./unzip_video.py) and run:
```bash
python unzip_video.py
```
3. Download video frames through git (optional):
You can either download the frames following the instructions [here](https://motchallenge.net/tao_download.php) (recommended) or modify our provided [script](./download_frames.sh) and run
```bash
bash download_frames.sh
```
## 📚 Dataset Structure
The dataset should be structured like this:
```bash
TAO-Amodal
├── frames
│ └── train
│ ├── ArgoVerse
│ ├── BDD
│ ├── Charades
│ ├── HACS
│ ├── LaSOT
│ └── YFCC100M
├── amodal_annotations
│ ├── train/validation/test.json
│ ├── train_lvis_v1.json
│ └── validation_lvis_v1.json
├── example_output
│ └── prediction.json
├── BURST_annotations
│ ├── train
│ └── train_visibility.json
│ ...
```
## 📚 File Descriptions
| File Name | Description |
| -------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| train/validation/test.json | Formal annotation files. We use these annotations for visualization. Categories include those in [lvis](https://www.lvisdataset.org/) v0.5 and freeform categories. |
| train_lvis_v1.json | We use this file to train our [amodal-expander](https://tao-amodal.github.io/index.html#Amodal-Expander), treating each image frame as an independent sequence. Categories are aligned with those in lvis v1.0. |
| validation_lvis_v1.json | We use this file to evaluate our [amodal-expander](https://tao-amodal.github.io/index.html#Amodal-Expander). Categories are aligned with those in lvis v1.0. |
| prediction.json | Example output json from amodal-expander. Tracker predictions should be structured like this file to be evaluated with our [evaluation toolkit](https://github.com/WesleyHsieh0806/TAO-Amodal?tab=readme-ov-file#bar_chart-evaluation). |
| BURST_annotations/XXX.json | Modal mask annotations from [BURST dataset](https://github.com/Ali2500/BURST-benchmark) with our heuristic visibility attributes. We provide these files for the convenience of visualization |
### Annotation and Prediction Format
Our annotations are structured similarly to [TAO](https://github.com/TAO-Dataset/tao/blob/master/tao/toolkit/tao/tao.py#L4), with some modifications.
Annotations:
```bash
Annotation file format:
{
"info" : info,
"images" : [image],
"videos": [video],
"tracks": [track],
"annotations" : [annotation],
"categories": [category],
"licenses" : [license],
}
annotation: {
"id": int,
"image_id": int,
"track_id": int,
"bbox": [x,y,width,height],
"area": float,
# Redundant field for compatibility with COCO scripts
"category_id": int,
"video_id": int,
# Other important attributes for evaluation on TAO-Amodal
"amodal_bbox": [x,y,width,height],
"amodal_is_uncertain": bool,
"visibility": float, (0.~1.0)
}
image, info, video, track, category, licenses: Same as TAO
```
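As a quick illustration, the annotation files can be read with plain `json`; this is a minimal sketch assuming the directory layout shown above (the exact file path is illustrative):
```python
import json

# Read the validation annotations (path follows the layout shown earlier).
with open("TAO-Amodal/amodal_annotations/validation.json") as f:
    data = json.load(f)

print(len(data["annotations"]), "annotations,", len(data["videos"]), "videos")

# Each annotation carries the modal box, the amodal box, a visibility score
# in [0, 1], and an uncertainty flag for the amodal box.
ann = data["annotations"][0]
print(ann["bbox"], ann["amodal_bbox"], ann["visibility"], ann["amodal_is_uncertain"])
```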
Predictions should be structured as:
```bash
[{
"image_id" : int,
"category_id" : int,
"bbox" : [x,y,width,height],
"score" : float,
"track_id": int,
"video_id": int
}]
```
Refer to the instructions of the [TAO dataset](https://github.com/TAO-Dataset/tao/blob/master/docs/evaluation.md) for further details.
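As a minimal sketch, a prediction file in this format can be written with plain `json` (the values below are placeholders, not real tracker output):
```python
import json

# Placeholder predictions illustrating the expected fields; a real tracker
# emits one entry per predicted box per frame.
predictions = [
    {
        "image_id": 1,
        "category_id": 805,  # placeholder category id
        "bbox": [100.0, 50.0, 40.0, 80.0],  # [x, y, width, height]
        "score": 0.9,
        "track_id": 1,
        "video_id": 1,
    }
]

with open("prediction.json", "w") as f:
    json.dump(predictions, f)
```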
## 📺 Example Sequences
Check [here](https://tao-amodal.github.io/#TAO-Amodal) for more examples and [here](https://github.com/WesleyHsieh0806/TAO-Amodal?tab=readme-ov-file#artist-visualization) for visualization code.
[<img src="https://tao-amodal.github.io/static/images/car_and_bus.png" width="50%">](https://tao-amodal.github.io/dataset.html "tao-amodal")
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
```
@misc{hsieh2023tracking,
title={Tracking Any Object Amodally},
author={Cheng-Yen Hsieh and Tarasha Khurana and Achal Dave and Deva Ramanan},
year={2023},
eprint={2312.12433},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<details>
<summary>Please also cite the <a href="https://taodataset.org/">TAO</a> and <a href="https://github.com/Ali2500/BURST-benchmark">BURST</a> datasets if you use our dataset.</summary>
```
@inproceedings{dave2020tao,
title={Tao: A large-scale benchmark for tracking any object},
author={Dave, Achal and Khurana, Tarasha and Tokmakov, Pavel and Schmid, Cordelia and Ramanan, Deva},
booktitle={Computer Vision--ECCV 2020: 16th European Conference, Glasgow, UK, August 23--28, 2020, Proceedings, Part V 16},
pages={436--454},
year={2020},
organization={Springer}
}
@inproceedings{athar2023burst,
title={Burst: A benchmark for unifying object recognition, segmentation and tracking in video},
author={Athar, Ali and Luiten, Jonathon and Voigtlaender, Paul and Khurana, Tarasha and Dave, Achal and Leibe, Bastian and Ramanan, Deva},
booktitle={Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision},
pages={1674--1683},
year={2023}
}
```
</details>
|
patrickshitou/ArcMMLU | ---
license: cc-by-nc-sa-4.0
---
## Introduction
[ArcMMLU](https://github.com/stzhang-patrick/ArcMMLU) is a Chinese benchmark specifically designed for evaluating LLMs on Library & Information Science (LIS). It aims to evaluate the knowledge and reasoning capabilities of LLMs in the LIS academic field, which covers four key sub-areas: Archival Science, Data Science, Library Science, and Information Science. Please refer to our paper for more information: [ArcMMLU: A Library and Information Science Benchmark for Large Language Models](https://arxiv.org/abs/2311.18658).
It is important to note that the name ArcMMLU is derived from our previous large language model research project—[ArcGPT](https://arxiv.org/abs/2307.14852), which was primarily focused on Archival Science. Later, our research scope expanded from Archival Science to a broader field of information management, but we retained the name ArcMMLU. Therefore, ArcMMLU is not just an evaluation benchmark for Archival Science; it is a comprehensive evaluation dataset for the entire LIS discipline.
For convenience, ArcMMLU adopts the same data format as CMMLU, and we provide evaluation code based on the CMMLU project. For models that have already been evaluated on CMMLU, running an evaluation on ArcMMLU is straightforward. Special thanks to the [CMMLU---Chinese Multi-Task Language Understanding Evaluation](https://github.com/haonan-li/CMMLU) project for its contribution to the evaluation of Chinese LLMs. We hope that ArcMMLU can serve as a powerful supplement in specialized fields, bringing more detail and depth to the evaluation of Chinese LLMs.
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_111 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1154957256.0
num_examples: 226818
download_size: 1175930149
dataset_size: 1154957256.0
---
# Dataset Card for "chunk_111"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anujpaudel/linge-ping-1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 6525269.0
num_examples: 159
download_size: 6003377
dataset_size: 6525269.0
---
# Dataset Card for "linge-ping-1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ASR-HypR/LibriSpeech_withoutLM | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev_clean
path: data/dev_clean-*
- split: dev_other
path: data/dev_other-*
- split: test_clean
path: data/test_clean-*
- split: test_other
path: data/test_other-*
dataset_info:
features:
- name: utt_id
dtype: string
- name: hyps
sequence: string
- name: att_score
sequence: float64
- name: ctc_score
sequence: float64
- name: score
sequence: float64
- name: ref
dtype: string
splits:
- name: train
num_bytes: 2945538729
num_examples: 281231
- name: dev_clean
num_bytes: 18638182
num_examples: 2703
- name: dev_other
num_bytes: 17703758
num_examples: 2864
- name: test_clean
num_bytes: 18159132
num_examples: 2620
- name: test_other
num_bytes: 18159497
num_examples: 2939
download_size: 721788345
dataset_size: 3018199298
---
# Dataset Card for "LibriSpeech_withoutLM"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
WasamiKirua/haiku-ita | ---
license: apache-2.0
task_categories:
- text-generation
language:
- it
tags:
- haiku
- japan
- synthetic
---
<img src="https://i.postimg.cc/nhfZs6nC/00045-4085794636.png" alt="haiku" border="0" width="1024px">
# Italian Haiku in ShareGpt format
## Dataset Summary
The dataset contains haiku generated in Italian, following specific instructions and rules for the Italian language.
## Citation (Prompts)
The prompts from https://huggingface.co/datasets/davanstrien/haiku_prompts have been translated using ChatGPT 3.5 Turbo. Each haiku has then been generated using the following prompt:
## Haiku Generation
The following prompt has been used:
```
Crea un haiku in italiano che segua queste linee guida specifiche: Struttura e Metrica: Componi un poema breve di tre versi.
Cerca di avvicinarti a una struttura metrica di 5-7-5 sillabe per verso,ma sentiti libero di adattare leggermente il conteggio delle sillabe per mantenere l'armonia
e la naturalità del linguaggio italiano.
Elemento Stagionale (Kigo): Includi nel tuo haiku un riferimento chiaro a una delle quattro stagioni (primavera, estate, autunno, inverno).
Questo può essere fatto attraverso l'uso di immagini naturali, parole o concetti che evocano specificamente quel periodo dell'anno.
Taglio (Kireji): Usa una forma di pausa, come la punteggiatura (virgola, punto e virgola, punto) o un cambio di immagine o tono tra i versi,
per creare un momento di riflessione o sorpresa. Questa pausa dovrebbe servire a dividere il poema in due parti che, pur essendo distinte,
rimangono connesse in significato o emozione.
Semplicità ed Essenzialità: Concentrati su immagini e concetti semplici, preferibilmente legati alla natura o a momenti quotidiani,
che rivelino qualcosa di più profondo sulla condizione umana, sulla natura o sulla spiritualità.
Ogni parola deve essere scelta con cura per la sua capacità di evocare immagini vivide e significati ricchi.
Evita Rime e Metafore Complesse: Mantieni il linguaggio diretto e privo di rime forzate o di complesse figure retoriche.
L'haiku dovrebbe preferire la chiarezza e l'immediatezza, con un focus sulla potenza evocativa delle immagini naturali e quotidiane.
Momento Istantaneo: Cerca di catturare l'essenza di un attimo fugace, offrendo una visione o un'osservazione che,
pur nella sua brevità, apre a riflessioni più ampie o universali.
Originalità e Personalità: Lascia che la tua voce unica traspaia nell'haiku, esplorando temi,
immagini e emozioni che ti sono personali o che ti colpiscono particolarmente.
Ricorda che, nonostante le regole, l'haiku è un'espressione artistica soggettiva e personale.
```
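To take a quick look at the generated haiku, the dataset can be loaded with the `datasets` library. This is a minimal sketch: the "train" split name and the exact column layout of the ShareGPT-style records are assumptions, since the card does not list them explicitly.
```python
from datasets import load_dataset

# Load the Italian haiku dataset and inspect one record; the "train" split
# name is an assumption, as the card does not list the splits explicitly.
ds = load_dataset("WasamiKirua/haiku-ita", split="train")
print(ds[0])
```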
|
open-llm-leaderboard/details_mlabonne__Mistralpaca-7B | ---
pretty_name: Evaluation run of mlabonne/Mistralpaca-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mlabonne/Mistralpaca-7B](https://huggingface.co/mlabonne/Mistralpaca-7B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mlabonne__Mistralpaca-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-22T17:05:59.631775](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__Mistralpaca-7B/blob/main/results_2024-03-22T17-05-59.631775.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each one in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5938237050372654,\n\
\ \"acc_stderr\": 0.033175671534551573,\n \"acc_norm\": 0.59950747855604,\n\
\ \"acc_norm_stderr\": 0.03385088815881686,\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.016850961061720116,\n \"mc2\": 0.5317344352690491,\n\
\ \"mc2_stderr\": 0.015954565977647522\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5853242320819113,\n \"acc_stderr\": 0.014397070564409174,\n\
\ \"acc_norm\": 0.6203071672354948,\n \"acc_norm_stderr\": 0.014182119866974872\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6468830910177256,\n\
\ \"acc_stderr\": 0.00476961882919651,\n \"acc_norm\": 0.8343955387373033,\n\
\ \"acc_norm_stderr\": 0.003709654977628468\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365245,\n\
\ \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365245\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n\
\ \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n\
\ \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383887,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383887\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5350877192982456,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.5350877192982456,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.02510742548113729,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.02510742548113729\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7064516129032258,\n \"acc_stderr\": 0.025906087021319295,\n \"\
acc_norm\": 0.7064516129032258,\n \"acc_norm_stderr\": 0.025906087021319295\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"\
acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\
\ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5615384615384615,\n \"acc_stderr\": 0.025158266016868585,\n\
\ \"acc_norm\": 0.5615384615384615,\n \"acc_norm_stderr\": 0.025158266016868585\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911498,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911498\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.031968769891957786,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.031968769891957786\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7651376146788991,\n \"acc_stderr\": 0.01817511051034357,\n \"\
acc_norm\": 0.7651376146788991,\n \"acc_norm_stderr\": 0.01817511051034357\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.033247089118091176,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.033247089118091176\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145628,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145628\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842538,\n \
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842538\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.040393149787245605,\n\
\ \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.040393149787245605\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7854406130268199,\n\
\ \"acc_stderr\": 0.014680033956893346,\n \"acc_norm\": 0.7854406130268199,\n\
\ \"acc_norm_stderr\": 0.014680033956893346\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6329479768786127,\n \"acc_stderr\": 0.02595005433765407,\n\
\ \"acc_norm\": 0.6329479768786127,\n \"acc_norm_stderr\": 0.02595005433765407\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.32737430167597764,\n\
\ \"acc_stderr\": 0.015694238967737386,\n \"acc_norm\": 0.32737430167597764,\n\
\ \"acc_norm_stderr\": 0.015694238967737386\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.026992544339297243,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.026992544339297243\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n\
\ \"acc_stderr\": 0.02685882587948854,\n \"acc_norm\": 0.662379421221865,\n\
\ \"acc_norm_stderr\": 0.02685882587948854\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6697530864197531,\n \"acc_stderr\": 0.026168298456732846,\n\
\ \"acc_norm\": 0.6697530864197531,\n \"acc_norm_stderr\": 0.026168298456732846\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.43617021276595747,\n \"acc_stderr\": 0.029583452036284073,\n \
\ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.029583452036284073\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4256844850065189,\n\
\ \"acc_stderr\": 0.012628393551811945,\n \"acc_norm\": 0.4256844850065189,\n\
\ \"acc_norm_stderr\": 0.012628393551811945\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.02952009569768776,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.02952009569768776\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6013071895424836,\n \"acc_stderr\": 0.01980828131744984,\n \
\ \"acc_norm\": 0.6013071895424836,\n \"acc_norm_stderr\": 0.01980828131744984\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.689795918367347,\n \"acc_stderr\": 0.029613459872484378,\n\
\ \"acc_norm\": 0.689795918367347,\n \"acc_norm_stderr\": 0.029613459872484378\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n\
\ \"acc_stderr\": 0.0294752502360172,\n \"acc_norm\": 0.7761194029850746,\n\
\ \"acc_norm_stderr\": 0.0294752502360172\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338733,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338733\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.016850961061720116,\n \"mc2\": 0.5317344352690491,\n\
\ \"mc2_stderr\": 0.015954565977647522\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7434885556432518,\n \"acc_stderr\": 0.012273648008759979\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.33131159969673996,\n \
\ \"acc_stderr\": 0.012964999679688664\n }\n}\n```"
repo_url: https://huggingface.co/mlabonne/Mistralpaca-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|arc:challenge|25_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|gsm8k|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hellaswag|10_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T17-05-59.631775.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T17-05-59.631775.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- '**/details_harness|winogrande|5_2024-03-22T17-05-59.631775.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T17-05-59.631775.parquet'
- config_name: results
data_files:
- split: 2024_03_22T17_05_59.631775
path:
- results_2024-03-22T17-05-59.631775.parquet
- split: latest
path:
- results_2024-03-22T17-05-59.631775.parquet
---
# Dataset Card for Evaluation run of mlabonne/Mistralpaca-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mlabonne/Mistralpaca-7B](https://huggingface.co/mlabonne/Mistralpaca-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mlabonne__Mistralpaca-7B",
"harness_winogrande_5",
split="train")
```
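To pull the aggregated metrics rather than the per-task details, the same call works with the `results` configuration (a minimal sketch; the `latest` split points to the most recent run, as listed in the configs above):
```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_mlabonne__Mistralpaca-7B",
	"results",
	split="latest")

# Inspect the first row of the aggregated results.
print(results[0])
```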
## Latest results
These are the [latest results from run 2024-03-22T17:05:59.631775](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__Mistralpaca-7B/blob/main/results_2024-03-22T17-05-59.631775.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5938237050372654,
"acc_stderr": 0.033175671534551573,
"acc_norm": 0.59950747855604,
"acc_norm_stderr": 0.03385088815881686,
"mc1": 0.36474908200734396,
"mc1_stderr": 0.016850961061720116,
"mc2": 0.5317344352690491,
"mc2_stderr": 0.015954565977647522
},
"harness|arc:challenge|25": {
"acc": 0.5853242320819113,
"acc_stderr": 0.014397070564409174,
"acc_norm": 0.6203071672354948,
"acc_norm_stderr": 0.014182119866974872
},
"harness|hellaswag|10": {
"acc": 0.6468830910177256,
"acc_stderr": 0.00476961882919651,
"acc_norm": 0.8343955387373033,
"acc_norm_stderr": 0.003709654977628468
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.029647813539365245,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.029647813539365245
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383887,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383887
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5350877192982456,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.5350877192982456,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.02510742548113729,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.02510742548113729
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7064516129032258,
"acc_stderr": 0.025906087021319295,
"acc_norm": 0.7064516129032258,
"acc_norm_stderr": 0.025906087021319295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5615384615384615,
"acc_stderr": 0.025158266016868585,
"acc_norm": 0.5615384615384615,
"acc_norm_stderr": 0.025158266016868585
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911498,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911498
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.031968769891957786,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.031968769891957786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7651376146788991,
"acc_stderr": 0.01817511051034357,
"acc_norm": 0.7651376146788991,
"acc_norm_stderr": 0.01817511051034357
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.033247089118091176,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.033247089118091176
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.029178682304842538,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.029178682304842538
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.040393149787245605,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.040393149787245605
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7854406130268199,
"acc_stderr": 0.014680033956893346,
"acc_norm": 0.7854406130268199,
"acc_norm_stderr": 0.014680033956893346
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6329479768786127,
"acc_stderr": 0.02595005433765407,
"acc_norm": 0.6329479768786127,
"acc_norm_stderr": 0.02595005433765407
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.32737430167597764,
"acc_stderr": 0.015694238967737386,
"acc_norm": 0.32737430167597764,
"acc_norm_stderr": 0.015694238967737386
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.026992544339297243,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.026992544339297243
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.02685882587948854,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.02685882587948854
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6697530864197531,
"acc_stderr": 0.026168298456732846,
"acc_norm": 0.6697530864197531,
"acc_norm_stderr": 0.026168298456732846
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.029583452036284073,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.029583452036284073
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4256844850065189,
"acc_stderr": 0.012628393551811945,
"acc_norm": 0.4256844850065189,
"acc_norm_stderr": 0.012628393551811945
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.02952009569768776,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.02952009569768776
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6013071895424836,
"acc_stderr": 0.01980828131744984,
"acc_norm": 0.6013071895424836,
"acc_norm_stderr": 0.01980828131744984
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.689795918367347,
"acc_stderr": 0.029613459872484378,
"acc_norm": 0.689795918367347,
"acc_norm_stderr": 0.029613459872484378
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.0294752502360172,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.0294752502360172
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338733,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338733
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36474908200734396,
"mc1_stderr": 0.016850961061720116,
"mc2": 0.5317344352690491,
"mc2_stderr": 0.015954565977647522
},
"harness|winogrande|5": {
"acc": 0.7434885556432518,
"acc_stderr": 0.012273648008759979
},
"harness|gsm8k|5": {
"acc": 0.33131159969673996,
"acc_stderr": 0.012964999679688664
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
huggingartists/suicideoscope | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/suicideoscope"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.052767 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/86b0ba099a6797bab3deeba685f3dbc2.800x800x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/suicideoscope">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Suicideoscope</div>
<a href="https://genius.com/artists/suicideoscope">
<div style="text-align: center; font-size: 14px;">@suicideoscope</div>
</a>
</div>
### Dataset Summary
This lyrics dataset was parsed from Genius and is designed for generating lyrics with HuggingArtists.
The corresponding model is available [here](https://huggingface.co/huggingartists/suicideoscope).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
You can load this dataset directly with the `datasets` library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/suicideoscope")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
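As a quick sanity check, the `text` field of any record can be inspected directly after loading (a minimal sketch, assuming the `load_dataset` call shown above):

```python
from datasets import load_dataset

# Load the lyrics dataset (it ships with a single "train" split).
dataset = load_dataset("huggingartists/suicideoscope")

# Each record is a dict with a single string field named "text".
first_record = dataset["train"][0]
print(first_record["text"][:200])  # preview the first 200 characters
```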
### Data Splits
| train | validation | test |
|------:|-----------:|-----:|
|    11 |          - |    - |
The 'train' split can easily be divided into 'train', 'validation', and 'test' splits with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np

datasets = load_dataset("huggingartists/suicideoscope")

# Target proportions; whatever remains after train + validation becomes the test split.
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03

# Split the list of lyrics at the two cut-off indices derived from the percentages.
texts = datasets['train']['text']
train, validation, test = np.split(
    texts,
    [int(len(texts) * train_percentage), int(len(texts) * (train_percentage + validation_percentage))],
)

# Rebuild a DatasetDict with the three new splits.
datasets = DatasetDict(
    {
        'train': Dataset.from_dict({'text': list(train)}),
        'validation': Dataset.from_dict({'text': list(validation)}),
        'test': Dataset.from_dict({'text': list(test)})
    }
)
```
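With the 11 training examples listed above, the split points evaluate to int(11 * 0.9) = 9 and int(11 * 0.97) = 10, so the resulting splits hold 9, 1, and 1 examples respectively. A minimal sketch for checking the new split sizes, reusing the `datasets` object built above:

```python
# Print the number of examples per split after re-splitting.
for split_name, split_dataset in datasets.items():
    print(split_name, len(split_dataset))
# Expected for 11 source examples: train 9, validation 1, test 1
```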
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author={Aleksey Korshuk},
    year={2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
Eduardovco/b | ---
license: openrail
---
|
AdapterOcean/code_instructions_standardized_cluster_14_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 8205184
num_examples: 5311
download_size: 4004314
dataset_size: 8205184
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_instructions_standardized_cluster_14_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gingercake01/stt0411largev2 | ---
license: mit
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 2251400392
num_examples: 2344
- name: test
num_bytes: 281424592
num_examples: 293
- name: valid
num_bytes: 281424232
num_examples: 293
download_size: 446861823
dataset_size: 2814249216
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
gokulraj/hindi | ---
dataset_info:
features:
- name: English
dtype: string
- name: 'Hinglish '
dtype: string
- name: ' combined'
dtype: string
splits:
- name: train
num_bytes: 628
num_examples: 3
download_size: 3143
dataset_size: 628
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nitinbhayana/40_spell_error_data_v1.3 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 31962
num_examples: 40
download_size: 17465
dataset_size: 31962
---
# Dataset Card for "40_spell_error_data_v1.3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nlp-pw/disaster_tweets | ---
dataset_info:
features:
- name: tweet_text
dtype: string
- name: disaster_type
dtype: int64
splits:
- name: train
num_bytes: 16972808
num_examples: 107359
download_size: 10414888
dataset_size: 16972808
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
E1010836/cegid | ---
license: openrail
---
# 23/09/02 20:50
Test dataset files created based on the questions from Vianney.
Only two questions. The train split is identical to the test split (Train = Test).
|
iamroot/chat_formatted_examples | ---
license: mit
---
|
Balajb/test-bala | ---
license: other
---
|
open-llm-leaderboard/details_mlabonne__Zebrafish-7B | ---
pretty_name: Evaluation run of mlabonne/Zebrafish-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mlabonne/Zebrafish-7B](https://huggingface.co/mlabonne/Zebrafish-7B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mlabonne__Zebrafish-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-02T17:41:35.346297](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__Zebrafish-7B/blob/main/results_2024-04-02T17-41-35.346297.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.650261490863648,\n\
\ \"acc_stderr\": 0.03205613567290235,\n \"acc_norm\": 0.6493283120698816,\n\
\ \"acc_norm_stderr\": 0.03273021626386405,\n \"mc1\": 0.631578947368421,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7791807541368658,\n\
\ \"mc2_stderr\": 0.01371306311341921\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7133105802047781,\n \"acc_stderr\": 0.013214986329274776,\n\
\ \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710696\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7164907388966342,\n\
\ \"acc_stderr\": 0.004497803024345146,\n \"acc_norm\": 0.891256721768572,\n\
\ \"acc_norm_stderr\": 0.0031068060075356255\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066482,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066482\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.0303883535518868,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.0303883535518868\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42569832402234636,\n\
\ \"acc_stderr\": 0.01653682964899711,\n \"acc_norm\": 0.42569832402234636,\n\
\ \"acc_norm_stderr\": 0.01653682964899711\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n\
\ \"acc_stderr\": 0.012752858346533133,\n \"acc_norm\": 0.47392438070404175,\n\
\ \"acc_norm_stderr\": 0.012752858346533133\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069443,\n \
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069443\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.631578947368421,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7791807541368658,\n\
\ \"mc2_stderr\": 0.01371306311341921\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8492501973164956,\n \"acc_stderr\": 0.010056094631479696\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6990144048521607,\n \
\ \"acc_stderr\": 0.012634504465211187\n }\n}\n```"
repo_url: https://huggingface.co/mlabonne/Zebrafish-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|arc:challenge|25_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|gsm8k|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hellaswag|10_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T17-41-35.346297.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T17-41-35.346297.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- '**/details_harness|winogrande|5_2024-04-02T17-41-35.346297.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-02T17-41-35.346297.parquet'
- config_name: results
data_files:
- split: 2024_04_02T17_41_35.346297
path:
- results_2024-04-02T17-41-35.346297.parquet
- split: latest
path:
- results_2024-04-02T17-41-35.346297.parquet
---
# Dataset Card for Evaluation run of mlabonne/Zebrafish-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mlabonne/Zebrafish-7B](https://huggingface.co/mlabonne/Zebrafish-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mlabonne__Zebrafish-7B",
"harness_winogrande_5",
split="train")
```
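The aggregated metrics can be pulled the same way by pointing at the "results" configuration and its "latest" split (a minimal sketch following the same pattern; the config and split names are taken from the listing above):

```python
from datasets import load_dataset

# Load the aggregated results for the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_mlabonne__Zebrafish-7B",
    "results",
    split="latest",
)
print(results[0])
```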
## Latest results
These are the [latest results from run 2024-04-02T17:41:35.346297](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__Zebrafish-7B/blob/main/results_2024-04-02T17-41-35.346297.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.650261490863648,
"acc_stderr": 0.03205613567290235,
"acc_norm": 0.6493283120698816,
"acc_norm_stderr": 0.03273021626386405,
"mc1": 0.631578947368421,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.7791807541368658,
"mc2_stderr": 0.01371306311341921
},
"harness|arc:challenge|25": {
"acc": 0.7133105802047781,
"acc_stderr": 0.013214986329274776,
"acc_norm": 0.7312286689419796,
"acc_norm_stderr": 0.012955065963710696
},
"harness|hellaswag|10": {
"acc": 0.7164907388966342,
"acc_stderr": 0.004497803024345146,
"acc_norm": 0.891256721768572,
"acc_norm_stderr": 0.0031068060075356255
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066482,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.0303883535518868,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.0303883535518868
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42569832402234636,
"acc_stderr": 0.01653682964899711,
"acc_norm": 0.42569832402234636,
"acc_norm_stderr": 0.01653682964899711
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47392438070404175,
"acc_stderr": 0.012752858346533133,
"acc_norm": 0.47392438070404175,
"acc_norm_stderr": 0.012752858346533133
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069443,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.631578947368421,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.7791807541368658,
"mc2_stderr": 0.01371306311341921
},
"harness|winogrande|5": {
"acc": 0.8492501973164956,
"acc_stderr": 0.010056094631479696
},
"harness|gsm8k|5": {
"acc": 0.6990144048521607,
"acc_stderr": 0.012634504465211187
}
}
```
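To work with these numbers programmatically, you can paste the dictionary above into Python and rank tasks by accuracy. A minimal sketch using only the standard library; the small `results` dict below is an assumed excerpt of the full dictionary shown above:
```python
# Excerpt of the results dictionary printed above (values copied verbatim).
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.7312286689419796},
    "harness|hellaswag|10": {"acc_norm": 0.891256721768572},
    "harness|winogrande|5": {"acc": 0.8492501973164956},
    "harness|gsm8k|5": {"acc": 0.6990144048521607},
}

# Rank tasks from highest to lowest score, preferring acc_norm when present.
ranked = sorted(
    results.items(),
    key=lambda kv: kv[1].get("acc_norm", kv[1].get("acc", 0.0)),
    reverse=True,
)
for task, metrics in ranked:
    score = metrics.get("acc_norm", metrics.get("acc"))
    print(f"{task}: {score:.4f}")
```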
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
dimitars/doctorai-dataset | ---
license: openrail
---
|
tvergho/oasst_prometheus_eval_5_zephyrsft_lvl1_kto | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
- name: label
dtype: bool
- name: is_reference
dtype: bool
splits:
- name: train
num_bytes: 14278207
num_examples: 13415
download_size: 6859964
dataset_size: 14278207
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "oasst_prometheus_eval_5_zephyrsft_lvl1_kto"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davidgasquez/spain_aemet_historical_weather | ---
dataset_info:
features:
- name: fecha
dtype: timestamp[us]
- name: indicativo
dtype: string
- name: nombre
dtype: string
- name: provincia
dtype: string
- name: latitud
dtype: int64
- name: longitud
dtype: int64
- name: altitud
dtype: int32
- name: tmed
dtype: float32
- name: prec
dtype: float32
- name: tmin
dtype: float32
- name: horatmin
dtype: string
- name: tmax
dtype: float32
- name: horatmax
dtype: string
- name: dir
dtype: float64
- name: velmedia
dtype: float32
- name: racha
dtype: float32
- name: horaracha
dtype: string
- name: presMax
dtype: float32
- name: horaPresMax
dtype: string
- name: presMin
dtype: float32
- name: horaPresMin
dtype: string
- name: hrMedia
dtype: float64
- name: hrMax
dtype: float64
- name: horaHrMax
dtype: string
- name: hrMin
dtype: float64
- name: horaHrMin
dtype: string
- name: sol
dtype: float32
splits:
- name: main
num_bytes: 1177713777
num_examples: 6249425
download_size: 365118630
dataset_size: 1177713777
configs:
- config_name: default
data_files:
- split: main
path: data/main-*
---
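The schema above defines a single "main" split of daily station observations. A minimal usage sketch, assuming the column names from the YAML schema; the literal province value used in the filter is only an illustrative assumption:
```python
from datasets import load_dataset

# Load the daily AEMET observations (single "main" split; ~365 MB download).
weather = load_dataset("davidgasquez/spain_aemet_historical_weather", split="main")

# Keep only rows from one province; "MADRID" is an assumed example value.
madrid = weather.filter(lambda row: row["provincia"] == "MADRID")
if len(madrid) > 0:
    print(madrid[0]["fecha"], madrid[0]["tmed"])
```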
|
open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia3 | ---
pretty_name: Evaluation run of SC99/Mistral-7B-privatemix-ia3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SC99/Mistral-7B-privatemix-ia3](https://huggingface.co/SC99/Mistral-7B-privatemix-ia3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-28T23:12:46.191755](https://huggingface.co/datasets/open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia3/blob/main/results_2024-01-28T23-12-46.191755.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6477608633784878,\n\
\ \"acc_stderr\": 0.03220667400265738,\n \"acc_norm\": 0.6471674562866768,\n\
\ \"acc_norm_stderr\": 0.032886180571357726,\n \"mc1\": 0.565483476132191,\n\
\ \"mc1_stderr\": 0.01735273874925956,\n \"mc2\": 0.7012569167721324,\n\
\ \"mc2_stderr\": 0.01507953669061273\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7030716723549488,\n \"acc_stderr\": 0.013352025976725223,\n\
\ \"acc_norm\": 0.7337883959044369,\n \"acc_norm_stderr\": 0.012915774781523205\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7150965943039235,\n\
\ \"acc_stderr\": 0.004504459553909765,\n \"acc_norm\": 0.8868751244771957,\n\
\ \"acc_norm_stderr\": 0.0031609804549511764\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n\
\ \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n\
\ \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n\
\ \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n\
\ \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n\
\ \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.046970851366478626,\n\
\ \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.046970851366478626\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"\
acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.025487187147859375,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.025487187147859375\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.02805779167298902,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.02805779167298902\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083008,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083008\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977945,\n\
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977945\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092444,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092444\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601436,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601436\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098823,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098823\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179326,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179326\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973136,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973136\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323385,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323385\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4011173184357542,\n\
\ \"acc_stderr\": 0.01639222189940708,\n \"acc_norm\": 0.4011173184357542,\n\
\ \"acc_norm_stderr\": 0.01639222189940708\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4634941329856584,\n\
\ \"acc_stderr\": 0.012736153390214963,\n \"acc_norm\": 0.4634941329856584,\n\
\ \"acc_norm_stderr\": 0.012736153390214963\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254184,\n\
\ \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254184\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6601307189542484,\n \"acc_stderr\": 0.019162418588623553,\n \
\ \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.019162418588623553\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233278,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233278\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.565483476132191,\n\
\ \"mc1_stderr\": 0.01735273874925956,\n \"mc2\": 0.7012569167721324,\n\
\ \"mc2_stderr\": 0.01507953669061273\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8666140489344909,\n \"acc_stderr\": 0.009555448026422974\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6664139499620925,\n \
\ \"acc_stderr\": 0.012987282131410809\n }\n}\n```"
repo_url: https://huggingface.co/SC99/Mistral-7B-privatemix-ia3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|arc:challenge|25_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|gsm8k|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hellaswag|10_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T23-12-46.191755.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T23-12-46.191755.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- '**/details_harness|winogrande|5_2024-01-28T23-12-46.191755.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-28T23-12-46.191755.parquet'
- config_name: results
data_files:
- split: 2024_01_28T23_12_46.191755
path:
- results_2024-01-28T23-12-46.191755.parquet
- split: latest
path:
- results_2024-01-28T23-12-46.191755.parquet
---
# Dataset Card for Evaluation run of SC99/Mistral-7B-privatemix-ia3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SC99/Mistral-7B-privatemix-ia3](https://huggingface.co/SC99/Mistral-7B-privatemix-ia3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia3",
"harness_winogrande_5",
split="train")
```
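Besides "train", every configuration listed in the YAML header above also exposes a "latest" split and a timestamped split, and the aggregated metrics are stored under the "results" configuration. A minimal sketch of loading both (variable names are illustrative; the configuration and split names are the ones declared above):
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia3"

# Aggregated metrics of the run ("results" configuration, "latest" split).
aggregated = load_dataset(repo, "results", split="latest")

# Per-task details pinned to the latest run instead of the "train" alias.
winogrande_details = load_dataset(repo, "harness_winogrande_5", split="latest")

print(aggregated)
print(winogrande_details)
```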
## Latest results
These are the [latest results from run 2024-01-28T23:12:46.191755](https://huggingface.co/datasets/open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia3/blob/main/results_2024-01-28T23-12-46.191755.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6477608633784878,
"acc_stderr": 0.03220667400265738,
"acc_norm": 0.6471674562866768,
"acc_norm_stderr": 0.032886180571357726,
"mc1": 0.565483476132191,
"mc1_stderr": 0.01735273874925956,
"mc2": 0.7012569167721324,
"mc2_stderr": 0.01507953669061273
},
"harness|arc:challenge|25": {
"acc": 0.7030716723549488,
"acc_stderr": 0.013352025976725223,
"acc_norm": 0.7337883959044369,
"acc_norm_stderr": 0.012915774781523205
},
"harness|hellaswag|10": {
"acc": 0.7150965943039235,
"acc_stderr": 0.004504459553909765,
"acc_norm": 0.8868751244771957,
"acc_norm_stderr": 0.0031609804549511764
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.025487187147859375,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.025487187147859375
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.02805779167298902,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.02805779167298902
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083008,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083008
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977945,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977945
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092444,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092444
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601436,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601436
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137296,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137296
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098823,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098823
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179326,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179326
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973136,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323385,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323385
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4011173184357542,
"acc_stderr": 0.01639222189940708,
"acc_norm": 0.4011173184357542,
"acc_norm_stderr": 0.01639222189940708
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4634941329856584,
"acc_stderr": 0.012736153390214963,
"acc_norm": 0.4634941329856584,
"acc_norm_stderr": 0.012736153390214963
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254184,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254184
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.019162418588623553,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.019162418588623553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233278,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233278
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.565483476132191,
"mc1_stderr": 0.01735273874925956,
"mc2": 0.7012569167721324,
"mc2_stderr": 0.01507953669061273
},
"harness|winogrande|5": {
"acc": 0.8666140489344909,
"acc_stderr": 0.009555448026422974
},
"harness|gsm8k|5": {
"acc": 0.6664139499620925,
"acc_stderr": 0.012987282131410809
}
}
```
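For quick comparisons, the per-task entries in the dict above can be aggregated directly. The sketch below assumes `latest_results` holds a copy of that dict; the simple mean over the MMLU (hendrycksTest) subtasks is only an illustration, not necessarily the leaderboard's exact aggregation:
```python
# `latest_results` is assumed to be the dict shown above.
mmlu_keys = [k for k in latest_results if k.startswith("harness|hendrycksTest-")]
mmlu_mean_acc = sum(latest_results[k]["acc"] for k in mmlu_keys) / len(mmlu_keys)
print(f"MMLU subtasks: {len(mmlu_keys)}, mean acc: {mmlu_mean_acc:.4f}")

# The run-level aggregate is already available under the "all" key.
print(latest_results["all"]["acc"], latest_results["all"]["acc_stderr"])
```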
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_alnrg2arg__test3_sft_16bit_dpo2 | ---
pretty_name: Evaluation run of alnrg2arg/test3_sft_16bit_dpo2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [alnrg2arg/test3_sft_16bit_dpo2](https://huggingface.co/alnrg2arg/test3_sft_16bit_dpo2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_alnrg2arg__test3_sft_16bit_dpo2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-01T23:26:26.833091](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__test3_sft_16bit_dpo2/blob/main/results_2024-02-01T23-26-26.833091.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6525115378138006,\n\
\ \"acc_stderr\": 0.0321441319562114,\n \"acc_norm\": 0.6519969439704142,\n\
\ \"acc_norm_stderr\": 0.032814506961459725,\n \"mc1\": 0.5960832313341493,\n\
\ \"mc1_stderr\": 0.01717727682258428,\n \"mc2\": 0.7071252546997986,\n\
\ \"mc2_stderr\": 0.015071123394943023\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7158703071672355,\n \"acc_stderr\": 0.013179442447653886,\n\
\ \"acc_norm\": 0.7363481228668942,\n \"acc_norm_stderr\": 0.012875929151297046\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7276438956383191,\n\
\ \"acc_stderr\": 0.004442623590846324,\n \"acc_norm\": 0.8902609042023502,\n\
\ \"acc_norm_stderr\": 0.0031192548288489453\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603348\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097112,\n \
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097112\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092434,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092434\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676177,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676177\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128137,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128137\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993457,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993457\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4435754189944134,\n\
\ \"acc_stderr\": 0.01661568040100372,\n \"acc_norm\": 0.4435754189944134,\n\
\ \"acc_norm_stderr\": 0.01661568040100372\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4758800521512386,\n\
\ \"acc_stderr\": 0.012755368722863933,\n \"acc_norm\": 0.4758800521512386,\n\
\ \"acc_norm_stderr\": 0.012755368722863933\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5960832313341493,\n\
\ \"mc1_stderr\": 0.01717727682258428,\n \"mc2\": 0.7071252546997986,\n\
\ \"mc2_stderr\": 0.015071123394943023\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8437253354380426,\n \"acc_stderr\": 0.010205351791873499\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6747536012130402,\n \
\ \"acc_stderr\": 0.012903904752543917\n }\n}\n```"
repo_url: https://huggingface.co/alnrg2arg/test3_sft_16bit_dpo2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|arc:challenge|25_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|gsm8k|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hellaswag|10_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T23-26-26.833091.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T23-26-26.833091.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- '**/details_harness|winogrande|5_2024-02-01T23-26-26.833091.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-01T23-26-26.833091.parquet'
- config_name: results
data_files:
- split: 2024_02_01T23_26_26.833091
path:
- results_2024-02-01T23-26-26.833091.parquet
- split: latest
path:
- results_2024-02-01T23-26-26.833091.parquet
---
# Dataset Card for Evaluation run of alnrg2arg/test3_sft_16bit_dpo2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [alnrg2arg/test3_sft_16bit_dpo2](https://huggingface.co/alnrg2arg/test3_sft_16bit_dpo2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_alnrg2arg__test3_sft_16bit_dpo2",
"harness_winogrande_5",
split="train")
```
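If you want the aggregated metrics rather than the per-task details, the "results" configuration listed above can be loaded the same way. This is a minimal sketch, assuming the config and split names from the YAML listing; printing the first row is only for illustration:
```python
from datasets import load_dataset

# Aggregated results of the run; the "latest" split points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_alnrg2arg__test3_sft_16bit_dpo2",
	"results",
	split="latest")

print(results[0])
```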
## Latest results
These are the [latest results from run 2024-02-01T23:26:26.833091](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__test3_sft_16bit_dpo2/blob/main/results_2024-02-01T23-26-26.833091.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6525115378138006,
"acc_stderr": 0.0321441319562114,
"acc_norm": 0.6519969439704142,
"acc_norm_stderr": 0.032814506961459725,
"mc1": 0.5960832313341493,
"mc1_stderr": 0.01717727682258428,
"mc2": 0.7071252546997986,
"mc2_stderr": 0.015071123394943023
},
"harness|arc:challenge|25": {
"acc": 0.7158703071672355,
"acc_stderr": 0.013179442447653886,
"acc_norm": 0.7363481228668942,
"acc_norm_stderr": 0.012875929151297046
},
"harness|hellaswag|10": {
"acc": 0.7276438956383191,
"acc_stderr": 0.004442623590846324,
"acc_norm": 0.8902609042023502,
"acc_norm_stderr": 0.0031192548288489453
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603348,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603348
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.02385479568097112,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.02385479568097112
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092434,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092434
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676177,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676177
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128137,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993457,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993457
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069367,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4435754189944134,
"acc_stderr": 0.01661568040100372,
"acc_norm": 0.4435754189944134,
"acc_norm_stderr": 0.01661568040100372
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4758800521512386,
"acc_stderr": 0.012755368722863933,
"acc_norm": 0.4758800521512386,
"acc_norm_stderr": 0.012755368722863933
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5960832313341493,
"mc1_stderr": 0.01717727682258428,
"mc2": 0.7071252546997986,
"mc2_stderr": 0.015071123394943023
},
"harness|winogrande|5": {
"acc": 0.8437253354380426,
"acc_stderr": 0.010205351791873499
},
"harness|gsm8k|5": {
"acc": 0.6747536012130402,
"acc_stderr": 0.012903904752543917
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mkita/respones_with_log_and_cond_full | ---
dataset_info:
features:
- name: index
dtype: int64
- name: image
sequence:
sequence:
sequence: float64
- name: cond
sequence: float32
splits:
- name: train
num_bytes: 4651029240
num_examples: 295867
download_size: 80797243
dataset_size: 4651029240
---
# Dataset Card for "respones_with_log_and_cond_full"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nmac/lex_fridman_podcast | ---
task_categories:
- automatic-speech-recognition
- sentence-similarity
language:
- en
tags:
- podcast
- whisper
size_categories:
- 100K<n<1M
---
# Dataset Card for "lex_fridman_podcast"
### Dataset Summary
This dataset contains transcripts from the [Lex Fridman podcast](https://www.youtube.com/playlist?list=PLrAXtmErZgOdP_8GztsuKi9nrraNbKKp4) (Episodes 1 to 325).
The transcripts were generated using [OpenAI Whisper](https://github.com/openai/whisper) (large model) and made publicly available at: https://karpathy.ai/lexicap/index.html.
### Languages
- English
## Dataset Structure
The dataset contains around 803K entries, consisting of audio transcripts generated from episodes 1 to 325 of the [Lex Fridman podcast](https://www.youtube.com/playlist?list=PLrAXtmErZgOdP_8GztsuKi9nrraNbKKp4). In addition to the transcript text, the dataset includes other metadata such as episode id and title, guest name, and start and end timestamps for each transcript.
### Data Fields
The dataset schema is as follows (a short loading example is shown after the field list):
- **id**: Episode id.
- **guest**: Name of the guest interviewed.
- **title:** Title of the episode.
- **text:** Text of the transcription.
- **start:** Timestamp (`HH:mm:ss.mmm`) indicating the beginning of the transcription.
- **end:** Timestamp (`HH:mm:ss.mmm`) indicating the end of the transcription.
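As a quick orientation to these fields, the snippet below loads the transcripts and prints a single entry. This is a minimal sketch, assuming the default configuration exposes a `train` split:
```python
from datasets import load_dataset

# Assumes the default configuration with a "train" split.
ds = load_dataset("nmac/lex_fridman_podcast", split="train")

# Each row carries the fields documented above: id, guest, title, text, start, end.
example = ds[0]
print(f'Episode {example["id"]}: {example["title"]} (guest: {example["guest"]})')
print(f'[{example["start"]} -> {example["end"]}] {example["text"]}')
```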
### Source Data
Source data provided by Andrej Karpathy at: https://karpathy.ai/lexicap/index.html
### Contributions
Thanks to [nmac](https://huggingface.co/nmac) for adding this dataset. |
winglian/financial_phrasebank_augmented | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: label_text
dtype: string
- name: analysis
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 2470432
num_examples: 2252
download_size: 901227
dataset_size: 2470432
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Domenic091/VOCAL-APENAS2 | ---
license: openrail
---
|
Isotonic/dol-phi | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: system_prompt
dtype: string
- name: texts
dtype: string
- name: original_question
dtype: string
splits:
- name: train
num_bytes: 3582653597.7992654
num_examples: 1308843
- name: test
num_bytes: 1194220603.2007346
num_examples: 436282
download_size: 2320020715
dataset_size: 4776874201.0
---
# Dataset Card for "dol-phi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AshtonIsNotHere/biosift-nli | ---
dataset_info:
features:
- name: PMID
dtype: int64
- name: Title
dtype: string
- name: Abstract
dtype: string
- name: Hypothesis
dtype: string
- name: Entailment
dtype: float64
- name: Positive
dtype: bool
- name: Label_idx
dtype: int64
splits:
- name: train
num_bytes: 204165086
num_examples: 112070
- name: validation
num_bytes: 25747188
num_examples: 13958
- name: test
num_bytes: 25719177
num_examples: 13972
download_size: 11692785
dataset_size: 255631451
---
# Dataset Card for "biosift-nli"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_JCX-kcuf__Llama-2-7b-hf-gpt-3.5-80k | ---
pretty_name: Evaluation run of JCX-kcuf/Llama-2-7b-hf-gpt-3.5-80k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [JCX-kcuf/Llama-2-7b-hf-gpt-3.5-80k](https://huggingface.co/JCX-kcuf/Llama-2-7b-hf-gpt-3.5-80k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JCX-kcuf__Llama-2-7b-hf-gpt-3.5-80k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-28T00:04:47.376277](https://huggingface.co/datasets/open-llm-leaderboard/details_JCX-kcuf__Llama-2-7b-hf-gpt-3.5-80k/blob/main/results_2024-03-28T00-04-47.376277.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.46129015263982415,\n\
\ \"acc_stderr\": 0.034478372271379606,\n \"acc_norm\": 0.466156014223618,\n\
\ \"acc_norm_stderr\": 0.035256245444643106,\n \"mc1\": 0.2827417380660955,\n\
\ \"mc1_stderr\": 0.015764770836777308,\n \"mc2\": 0.41421125437928746,\n\
\ \"mc2_stderr\": 0.014859051283039066\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5008532423208191,\n \"acc_stderr\": 0.014611369529813283,\n\
\ \"acc_norm\": 0.53839590443686,\n \"acc_norm_stderr\": 0.014568245550296358\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5638319059948218,\n\
\ \"acc_stderr\": 0.004948952519517518,\n \"acc_norm\": 0.7577175861382195,\n\
\ \"acc_norm_stderr\": 0.004275886276011774\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.04017901275981749,\n\
\ \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.04017901275981749\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4641509433962264,\n \"acc_stderr\": 0.030693675018458003,\n\
\ \"acc_norm\": 0.4641509433962264,\n \"acc_norm_stderr\": 0.030693675018458003\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4508670520231214,\n\
\ \"acc_stderr\": 0.037940126746970296,\n \"acc_norm\": 0.4508670520231214,\n\
\ \"acc_norm_stderr\": 0.037940126746970296\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.032436186361081004,\n\
\ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.032436186361081004\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918407,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918407\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.042163702135578345,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.042163702135578345\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.49032258064516127,\n\
\ \"acc_stderr\": 0.02843867799890955,\n \"acc_norm\": 0.49032258064516127,\n\
\ \"acc_norm_stderr\": 0.02843867799890955\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.33004926108374383,\n \"acc_stderr\": 0.03308530426228258,\n\
\ \"acc_norm\": 0.33004926108374383,\n \"acc_norm_stderr\": 0.03308530426228258\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.03815494308688932,\n\
\ \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.03815494308688932\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5505050505050505,\n \"acc_stderr\": 0.035441324919479704,\n \"\
acc_norm\": 0.5505050505050505,\n \"acc_norm_stderr\": 0.035441324919479704\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6321243523316062,\n \"acc_stderr\": 0.034801756684660366,\n\
\ \"acc_norm\": 0.6321243523316062,\n \"acc_norm_stderr\": 0.034801756684660366\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.44358974358974357,\n \"acc_stderr\": 0.025189149894764198,\n\
\ \"acc_norm\": 0.44358974358974357,\n \"acc_norm_stderr\": 0.025189149894764198\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.026202766534652148,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.026202766534652148\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42436974789915966,\n \"acc_stderr\": 0.032104790510157764,\n\
\ \"acc_norm\": 0.42436974789915966,\n \"acc_norm_stderr\": 0.032104790510157764\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658751,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658751\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6403669724770642,\n \"acc_stderr\": 0.020575234660123776,\n \"\
acc_norm\": 0.6403669724770642,\n \"acc_norm_stderr\": 0.020575234660123776\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.30092592592592593,\n \"acc_stderr\": 0.03128039084329882,\n \"\
acc_norm\": 0.30092592592592593,\n \"acc_norm_stderr\": 0.03128039084329882\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5098039215686274,\n \"acc_stderr\": 0.035086373586305716,\n \"\
acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.035086373586305716\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6033755274261603,\n \"acc_stderr\": 0.03184399873811225,\n \
\ \"acc_norm\": 0.6033755274261603,\n \"acc_norm_stderr\": 0.03184399873811225\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5381165919282511,\n\
\ \"acc_stderr\": 0.03346015011973228,\n \"acc_norm\": 0.5381165919282511,\n\
\ \"acc_norm_stderr\": 0.03346015011973228\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5114503816793893,\n \"acc_stderr\": 0.04384140024078016,\n\
\ \"acc_norm\": 0.5114503816793893,\n \"acc_norm_stderr\": 0.04384140024078016\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.628099173553719,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5092592592592593,\n\
\ \"acc_stderr\": 0.04832853553437056,\n \"acc_norm\": 0.5092592592592593,\n\
\ \"acc_norm_stderr\": 0.04832853553437056\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5460122699386503,\n \"acc_stderr\": 0.0391170190467718,\n\
\ \"acc_norm\": 0.5460122699386503,\n \"acc_norm_stderr\": 0.0391170190467718\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5339805825242718,\n \"acc_stderr\": 0.0493929144727348,\n\
\ \"acc_norm\": 0.5339805825242718,\n \"acc_norm_stderr\": 0.0493929144727348\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7051282051282052,\n\
\ \"acc_stderr\": 0.02987257770889118,\n \"acc_norm\": 0.7051282051282052,\n\
\ \"acc_norm_stderr\": 0.02987257770889118\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6245210727969349,\n\
\ \"acc_stderr\": 0.01731661319718279,\n \"acc_norm\": 0.6245210727969349,\n\
\ \"acc_norm_stderr\": 0.01731661319718279\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4913294797687861,\n \"acc_stderr\": 0.0269150473553698,\n\
\ \"acc_norm\": 0.4913294797687861,\n \"acc_norm_stderr\": 0.0269150473553698\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.028629916715693413,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.028629916715693413\n \
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6077170418006431,\n\
\ \"acc_stderr\": 0.02773125864701199,\n \"acc_norm\": 0.6077170418006431,\n\
\ \"acc_norm_stderr\": 0.02773125864701199\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.49382716049382713,\n \"acc_stderr\": 0.027818623962583295,\n\
\ \"acc_norm\": 0.49382716049382713,\n \"acc_norm_stderr\": 0.027818623962583295\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36524822695035464,\n \"acc_stderr\": 0.028723863853281278,\n \
\ \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.028723863853281278\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3533246414602347,\n\
\ \"acc_stderr\": 0.01220840821108243,\n \"acc_norm\": 0.3533246414602347,\n\
\ \"acc_norm_stderr\": 0.01220840821108243\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5110294117647058,\n \"acc_stderr\": 0.03036544647727568,\n\
\ \"acc_norm\": 0.5110294117647058,\n \"acc_norm_stderr\": 0.03036544647727568\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.44281045751633985,\n \"acc_stderr\": 0.020095083154577347,\n \
\ \"acc_norm\": 0.44281045751633985,\n \"acc_norm_stderr\": 0.020095083154577347\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5545454545454546,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.5545454545454546,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4326530612244898,\n \"acc_stderr\": 0.031717528240626645,\n\
\ \"acc_norm\": 0.4326530612244898,\n \"acc_norm_stderr\": 0.031717528240626645\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6616915422885572,\n\
\ \"acc_stderr\": 0.033455630703391914,\n \"acc_norm\": 0.6616915422885572,\n\
\ \"acc_norm_stderr\": 0.033455630703391914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079023,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079023\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03565079670708311,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03565079670708311\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2827417380660955,\n\
\ \"mc1_stderr\": 0.015764770836777308,\n \"mc2\": 0.41421125437928746,\n\
\ \"mc2_stderr\": 0.014859051283039066\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7205998421468035,\n \"acc_stderr\": 0.012610826539404674\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1463229719484458,\n \
\ \"acc_stderr\": 0.009735210557785264\n }\n}\n```"
repo_url: https://huggingface.co/JCX-kcuf/Llama-2-7b-hf-gpt-3.5-80k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|arc:challenge|25_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|gsm8k|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hellaswag|10_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-28T00-04-47.376277.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-28T00-04-47.376277.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- '**/details_harness|winogrande|5_2024-03-28T00-04-47.376277.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-28T00-04-47.376277.parquet'
- config_name: results
data_files:
- split: 2024_03_28T00_04_47.376277
path:
- results_2024-03-28T00-04-47.376277.parquet
- split: latest
path:
- results_2024-03-28T00-04-47.376277.parquet
---
# Dataset Card for Evaluation run of JCX-kcuf/Llama-2-7b-hf-gpt-3.5-80k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [JCX-kcuf/Llama-2-7b-hf-gpt-3.5-80k](https://huggingface.co/JCX-kcuf/Llama-2-7b-hf-gpt-3.5-80k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JCX-kcuf__Llama-2-7b-hf-gpt-3.5-80k",
"harness_winogrande_5",
split="train")
```
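For instance, using the config names listed in the YAML header above, you could also pull the aggregated "results" config or the per-example details of a single MMLU subtask; a minimal sketch (the split name "latest" always points at the newest results):
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_JCX-kcuf__Llama-2-7b-hf-gpt-3.5-80k",
    "results",
    split="latest",
)

# Per-example details for one MMLU subtask (abstract algebra, 5-shot).
abstract_algebra = load_dataset(
    "open-llm-leaderboard/details_JCX-kcuf__Llama-2-7b-hf-gpt-3.5-80k",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
```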
## Latest results
These are the [latest results from run 2024-03-28T00:04:47.376277](https://huggingface.co/datasets/open-llm-leaderboard/details_JCX-kcuf__Llama-2-7b-hf-gpt-3.5-80k/blob/main/results_2024-03-28T00-04-47.376277.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.46129015263982415,
"acc_stderr": 0.034478372271379606,
"acc_norm": 0.466156014223618,
"acc_norm_stderr": 0.035256245444643106,
"mc1": 0.2827417380660955,
"mc1_stderr": 0.015764770836777308,
"mc2": 0.41421125437928746,
"mc2_stderr": 0.014859051283039066
},
"harness|arc:challenge|25": {
"acc": 0.5008532423208191,
"acc_stderr": 0.014611369529813283,
"acc_norm": 0.53839590443686,
"acc_norm_stderr": 0.014568245550296358
},
"harness|hellaswag|10": {
"acc": 0.5638319059948218,
"acc_stderr": 0.004948952519517518,
"acc_norm": 0.7577175861382195,
"acc_norm_stderr": 0.004275886276011774
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.04017901275981749,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.04017901275981749
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4641509433962264,
"acc_stderr": 0.030693675018458003,
"acc_norm": 0.4641509433962264,
"acc_norm_stderr": 0.030693675018458003
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4375,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.037940126746970296,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.037940126746970296
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.032436186361081004,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.032436186361081004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918407,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918407
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.042163702135578345,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.042163702135578345
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.49032258064516127,
"acc_stderr": 0.02843867799890955,
"acc_norm": 0.49032258064516127,
"acc_norm_stderr": 0.02843867799890955
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33004926108374383,
"acc_stderr": 0.03308530426228258,
"acc_norm": 0.33004926108374383,
"acc_norm_stderr": 0.03308530426228258
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.03815494308688932,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.03815494308688932
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5505050505050505,
"acc_stderr": 0.035441324919479704,
"acc_norm": 0.5505050505050505,
"acc_norm_stderr": 0.035441324919479704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6321243523316062,
"acc_stderr": 0.034801756684660366,
"acc_norm": 0.6321243523316062,
"acc_norm_stderr": 0.034801756684660366
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.44358974358974357,
"acc_stderr": 0.025189149894764198,
"acc_norm": 0.44358974358974357,
"acc_norm_stderr": 0.025189149894764198
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.026202766534652148,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.026202766534652148
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42436974789915966,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.42436974789915966,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658751,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658751
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6403669724770642,
"acc_stderr": 0.020575234660123776,
"acc_norm": 0.6403669724770642,
"acc_norm_stderr": 0.020575234660123776
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.30092592592592593,
"acc_stderr": 0.03128039084329882,
"acc_norm": 0.30092592592592593,
"acc_norm_stderr": 0.03128039084329882
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.035086373586305716,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.035086373586305716
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6033755274261603,
"acc_stderr": 0.03184399873811225,
"acc_norm": 0.6033755274261603,
"acc_norm_stderr": 0.03184399873811225
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5381165919282511,
"acc_stderr": 0.03346015011973228,
"acc_norm": 0.5381165919282511,
"acc_norm_stderr": 0.03346015011973228
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5114503816793893,
"acc_stderr": 0.04384140024078016,
"acc_norm": 0.5114503816793893,
"acc_norm_stderr": 0.04384140024078016
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.04832853553437056,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.04832853553437056
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5460122699386503,
"acc_stderr": 0.0391170190467718,
"acc_norm": 0.5460122699386503,
"acc_norm_stderr": 0.0391170190467718
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.5339805825242718,
"acc_stderr": 0.0493929144727348,
"acc_norm": 0.5339805825242718,
"acc_norm_stderr": 0.0493929144727348
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7051282051282052,
"acc_stderr": 0.02987257770889118,
"acc_norm": 0.7051282051282052,
"acc_norm_stderr": 0.02987257770889118
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6245210727969349,
"acc_stderr": 0.01731661319718279,
"acc_norm": 0.6245210727969349,
"acc_norm_stderr": 0.01731661319718279
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4913294797687861,
"acc_stderr": 0.0269150473553698,
"acc_norm": 0.4913294797687861,
"acc_norm_stderr": 0.0269150473553698
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5,
"acc_stderr": 0.028629916715693413,
"acc_norm": 0.5,
"acc_norm_stderr": 0.028629916715693413
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6077170418006431,
"acc_stderr": 0.02773125864701199,
"acc_norm": 0.6077170418006431,
"acc_norm_stderr": 0.02773125864701199
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.49382716049382713,
"acc_stderr": 0.027818623962583295,
"acc_norm": 0.49382716049382713,
"acc_norm_stderr": 0.027818623962583295
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36524822695035464,
"acc_stderr": 0.028723863853281278,
"acc_norm": 0.36524822695035464,
"acc_norm_stderr": 0.028723863853281278
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3533246414602347,
"acc_stderr": 0.01220840821108243,
"acc_norm": 0.3533246414602347,
"acc_norm_stderr": 0.01220840821108243
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5110294117647058,
"acc_stderr": 0.03036544647727568,
"acc_norm": 0.5110294117647058,
"acc_norm_stderr": 0.03036544647727568
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.44281045751633985,
"acc_stderr": 0.020095083154577347,
"acc_norm": 0.44281045751633985,
"acc_norm_stderr": 0.020095083154577347
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4326530612244898,
"acc_stderr": 0.031717528240626645,
"acc_norm": 0.4326530612244898,
"acc_norm_stderr": 0.031717528240626645
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6616915422885572,
"acc_stderr": 0.033455630703391914,
"acc_norm": 0.6616915422885572,
"acc_norm_stderr": 0.033455630703391914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079023,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079023
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03565079670708311,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03565079670708311
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2827417380660955,
"mc1_stderr": 0.015764770836777308,
"mc2": 0.41421125437928746,
"mc2_stderr": 0.014859051283039066
},
"harness|winogrande|5": {
"acc": 0.7205998421468035,
"acc_stderr": 0.012610826539404674
},
"harness|gsm8k|5": {
"acc": 0.1463229719484458,
"acc_stderr": 0.009735210557785264
}
}
```
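As a rough sketch of how these numbers can be post-processed (assuming `latest_results` is the dictionary printed above, e.g. pasted in or extracted from the linked JSON file; only two entries are reproduced here for brevity), the per-task accuracies can be flattened and ranked like this:
```python
# Hypothetical truncated copy of the dict shown above.
latest_results = {
    "all": {"acc": 0.46129015263982415, "acc_stderr": 0.034478372271379606},
    "harness|hendrycksTest-abstract_algebra|5": {
        "acc": 0.28,
        "acc_stderr": 0.04512608598542128,
    },
    # ... remaining tasks omitted
}

# Keep only per-task accuracies, skipping the aggregate "all" entry.
per_task = {
    name: metrics["acc"]
    for name, metrics in latest_results.items()
    if name != "all" and "acc" in metrics
}

# Print tasks from strongest to weakest accuracy.
for name, acc in sorted(per_task.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name:60s} {acc:.3f}")
```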
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
fedyanin/safety_llamas | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
splits:
- name: train
num_bytes: 1052833
num_examples: 2483
download_size: 547119
dataset_size: 1052833
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
Safety instructions and responses from https://github.com/vinid/safety-tuned-llamas
Training dataset only.
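A minimal loading sketch, assuming the default config and single train split declared in the header above (with `prompt` and `chosen` string columns):
```python
from datasets import load_dataset

# Load the train split; each row has a "prompt" and a "chosen" response.
ds = load_dataset("fedyanin/safety_llamas", split="train")
print(ds[0]["prompt"])
print(ds[0]["chosen"])
```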
I've posted it here for ease of use; for further details, please refer to the repo above and [the paper](https://arxiv.org/abs/2309.07875) |
dputilov/frontshadowlargesize | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: mask
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 2090600714.0119998
num_examples: 1214
download_size: 2064654477
dataset_size: 2090600714.0119998
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "frontshadowlargesize"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Norod78/futurama-blip-captions-512 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 233983029.0
num_examples: 834
download_size: 234014778
dataset_size: 233983029.0
---
# Dataset Card for "futurama-blip-captions-512"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-16k-ft | ---
pretty_name: Evaluation run of Yukang/Llama-2-13b-longlora-16k-ft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Yukang/Llama-2-13b-longlora-16k-ft](https://huggingface.co/Yukang/Llama-2-13b-longlora-16k-ft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-16k-ft\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T13:51:53.444348](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-16k-ft/blob/main/results_2023-10-25T13-51-53.444348.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"\
em_stderr\": 0.0,\n \"f1\": 0.0,\n \"f1_stderr\": 0.0,\n \"\
acc\": 0.2478295185477506,\n \"acc_stderr\": 0.007025978032038448\n },\n\
\ \"harness|drop|3\": {\n \"em\": 0.0,\n \"em_stderr\": 0.0,\n\
\ \"f1\": 0.0,\n \"f1_stderr\": 0.0\n },\n \"harness|gsm8k|5\"\
: {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.4956590370955012,\n \"acc_stderr\": 0.014051956064076896\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Yukang/Llama-2-13b-longlora-16k-ft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|arc:challenge|25_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T13_51_53.444348
path:
- '**/details_harness|drop|3_2023-10-25T13-51-53.444348.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T13-51-53.444348.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T13_51_53.444348
path:
- '**/details_harness|gsm8k|5_2023-10-25T13-51-53.444348.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T13-51-53.444348.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hellaswag|10_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T13_51_53.444348
path:
- '**/details_harness|winogrande|5_2023-10-25T13-51-53.444348.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T13-51-53.444348.parquet'
- config_name: results
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- results_2023-10-10T13-32-51.379088.parquet
- split: 2023_10_25T13_51_53.444348
path:
- results_2023-10-25T13-51-53.444348.parquet
- split: latest
path:
- results_2023-10-25T13-51-53.444348.parquet
---
# Dataset Card for Evaluation run of Yukang/Llama-2-13b-longlora-16k-ft
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Yukang/Llama-2-13b-longlora-16k-ft
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Yukang/Llama-2-13b-longlora-16k-ft](https://huggingface.co/Yukang/Llama-2-13b-longlora-16k-ft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-16k-ft",
"harness_winogrande_5",
split="train")
```
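The aggregated metrics can be loaded the same way from the `results` configuration declared in the YAML above; a minimal sketch (the split name `latest` follows that declaration):
```python
from datasets import load_dataset

# The "results" configuration aggregates every run; its "latest" split points
# to the most recent results file listed in the YAML configs above.
results = load_dataset(
    "open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-16k-ft",
    "results",
    split="latest",
)
print(results[0])
```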
## Latest results
These are the [latest results from run 2023-10-25T13:51:53.444348](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-16k-ft/blob/main/results_2023-10-25T13-51-53.444348.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.0,
"f1_stderr": 0.0,
"acc": 0.2478295185477506,
"acc_stderr": 0.007025978032038448
},
"harness|drop|3": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.0,
"f1_stderr": 0.0
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.4956590370955012,
"acc_stderr": 0.014051956064076896
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
result-kand2-sdxl-wuerst-karlo/bbe01f48 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 217
num_examples: 10
download_size: 1377
dataset_size: 217
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "bbe01f48"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/lmind_hotpot_train1000_eval200_v1_qa | ---
configs:
- config_name: default
data_files:
- split: train_qa
path: data/train_qa-*
- split: train_recite_qa
path: data/train_recite_qa-*
- split: eval_qa
path: data/eval_qa-*
- split: eval_recite_qa
path: data/eval_recite_qa-*
- split: all_docs
path: data/all_docs-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
splits:
- name: train_qa
num_bytes: 173266
num_examples: 1000
- name: train_recite_qa
num_bytes: 1024784
num_examples: 1000
- name: eval_qa
num_bytes: 33160
num_examples: 200
- name: eval_recite_qa
num_bytes: 208740
num_examples: 200
- name: all_docs
num_bytes: 1054269
num_examples: 2373
- name: train
num_bytes: 173266
num_examples: 1000
- name: validation
num_bytes: 33160
num_examples: 200
download_size: 1700646
dataset_size: 2700645
---
# Dataset Card for "lmind_hotpot_train1000_eval200_v1_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
datasets-examples/doc-yaml-1 | ---
configs:
- config_name: default
data_files:
- split: train
path: "data.csv"
- split: test
path: "holdout.csv"
size_categories:
- n<1K
---
# [doc] manual configuration 1
This dataset contains two csv files at the root, and a YAML field `configs` that specifies the data files and splits.
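As a minimal sketch of how these declared splits resolve (repository id taken from this card), each split can be loaded directly:
```python
from datasets import load_dataset

# "train" resolves to data.csv and "test" to holdout.csv, per the `configs` field above.
train = load_dataset("datasets-examples/doc-yaml-1", split="train")
test = load_dataset("datasets-examples/doc-yaml-1", split="test")
print(train)
print(test)
```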
|
TeraflopAI/Caselaw_Access_Project | ---
license: cc0-1.0
task_categories:
- text-generation
language:
- en
tags:
- legal
- law
- caselaw
pretty_name: Caselaw Access Project
size_categories:
- 1M<n<10M
---
<img src="https://huggingface.co/datasets/TeraflopAI/Caselaw_Access_project/resolve/main/cap.png" width="800">
# The Caselaw Access Project
In collaboration with Ravel Law, the Harvard Law Library digitized over 40 million pages of U.S. court decisions, comprising 6.7 million cases from the last 360 years, into a dataset that is widely accessible to use. Access a bulk download of the data through the Caselaw Access Project API (CAPAPI): https://case.law/caselaw/
Find more information about accessing state and federal written court decisions of common law through the bulk data service documentation here: https://case.law/docs/
Learn more about the Caselaw Access Project and all of the phenomenal work done by Jack Cushman, Greg Leppert, and Matteo Cargnelutti here: https://case.law/about/
Watch a live stream of the data release here: https://lil.law.harvard.edu/about/cap-celebration/stream
# Post-processing
Teraflop AI is excited to help support the Caselaw Access Project and the Harvard Library Innovation Lab in the release of over 6.6 million state and federal court decisions published throughout U.S. history. It is important to democratize fair access to this data for the public, the legal community, and researchers. This is a processed and cleaned version of the original CAP data.
During the digitization of these texts, OCR errors occurred. We post-processed each of the texts for model training to fix encoding, normalization, repetition, redundancy, parsing, and formatting issues.
Teraflop AI’s data engine allows for the massively parallel processing of web-scale datasets into cleaned text form. Our one-click deployment allowed us to easily split the computation across thousands of nodes on our managed infrastructure.
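A minimal sketch of reading the cleaned texts with the `datasets` library (the split name `train` is an assumption here; streaming is used only to avoid downloading the full multi-million-document corpus up front):
```python
from datasets import load_dataset

# Stream records so the full corpus does not have to be downloaded at once.
# The split name "train" is an assumption; adjust it to the splits the repository exposes.
cap = load_dataset("TeraflopAI/Caselaw_Access_Project", split="train", streaming=True)

first = next(iter(cap))  # one record; field names depend on the repository schema
print(first)
```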
# Licensing Information
The Caselaw Access Project dataset is licensed under the [CC0 License](https://creativecommons.org/public-domain/cc0/).
# Citation Information
```
The President and Fellows of Harvard University. "Caselaw Access Project." 2024, https://case.law/
```
```
@misc{ccap,
title={Cleaned Caselaw Access Project},
  author={Enrico Shippole and Aran Komatsuzaki},
  howpublished={\url{https://huggingface.co/datasets/TeraflopAI/Caselaw_Access_Project}},
year={2024}
}
``` |
keirp/hungarian_national_hs_finals_exam | ---
configs:
- config_name: default
data_files:
- split: test
path: test.csv
dataset_info:
features:
- name: Question
dtype: string
splits:
- name: test
num_bytes: 9300
num_examples: 33
download_size: 6283
dataset_size: 9300
---
# Testing Language Models on a Held-Out High School National Finals Exam
When xAI recently released [Grok-1](https://x.ai/), they evaluated it on the 2023 Hungarian national high school finals in mathematics, which was published after the training data cutoff for all the models in their evaluation. While MATH and GSM8k are the standard benchmarks for evaluating the mathematical abilities of large language models, **there are risks that modern models overfit to these datasets**, either from training directly on the test sets or from tuning the model hyperparameters to maximize test set performance. By evaluating on a truly held-out test set, we can better gauge the mathematical performance of these models.
We evaluate on the [2023 Hungarian national high school finals in mathematics](https://dload-oktatas.educatio.hu/erettsegi/feladatok_2023tavasz_kozep/k_matang_23maj_fl.pdf) and grade by hand using [the provided rubric](https://dload-oktatas.educatio.hu/erettsegi/feladatok_2023tavasz_kozep/k_matang_23maj_ut.pdf). All model solutions were graded by me over the course of one day. Model solutions were sampled using temperature 0.1.
For base models such as Code Llama, Llemma, and Mistral-7B, a 5-shot prompt was used. For instruction tuned models, we used the default prompt template for that model.
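For illustration only, a rough sketch of the sampling setup described above (the model id and prompt are placeholders; the actual harness used is `run_exam.py` below):
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EleutherAI/llemma_34b"  # placeholder; any evaluated model id works
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Base models get a 5-shot prompt; instruction-tuned models use their own chat template.
prompt = "..."  # placeholder prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512, do_sample=True, temperature=0.1)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```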
## Results
**Note**: In an earlier version of the LaTeX transcription of the exam, question 14a had incorrect formatting and question 14b did not contain all the required information to solve the problem. These issues have been fixed and the numbers are updated.
| Model | Exam Score | GSM8k | MATH |
| ------------------------------------------------------------------------------ | ---------- | ------ | ------ |
| [Code Llama 7B](https://huggingface.co/codellama/CodeLlama-7b-hf) (few-shot) | 8\% | 10.5% | 4.5% |
| [MetaMath 7B](https://huggingface.co/meta-math/MetaMath-7B-V1.0) | 20\% | 66.5\% | 19.8\% |
| [MAmmoTH 7B](https://huggingface.co/TIGER-Lab/MAmmoTH-7B) | 17\% | 50.5\% | 10.4\% |
| [MAmmoTH Coder 7B](https://huggingface.co/TIGER-Lab/MAmmoTH-Coder-7B) | 11\% | 22.5\% | 7.9\% |
| [Llemma 7B](https://huggingface.co/EleutherAI/llemma_7b) (few-shot) | 23\% | 36.4\% | 18\% |
| - | - | - | - |
| [Mistral 7B](https://huggingface.co/mistralai/Mistral-7B-v0.1) (few-shot) | 22\% | 39.2\% | - |
| [MetaMath Mistral 7B](https://huggingface.co/meta-math/MetaMath-Mistral-7B) | 29\% | 77.7\% | 28.2\% |
| [OpenChat 3.5](https://huggingface.co/openchat/openchat_3.5) | 37\% | 77.3\% | 28.6\% |
| - | - | - | - |
| [Code Llama 34B](https://huggingface.co/codellama/CodeLlama-34b-hf) (few-shot) | 15\% | 29.6\% | 12.2\% |
| [MAmmoTH Coder 34B](https://huggingface.co/TIGER-Lab/MAmmoTH-Coder-34B) | 17\% | 34.3\% | 11.6\% |
| [Llemma 34B](https://huggingface.co/EleutherAI/llemma_34b) (few-shot) | 43\% | 51.5\% | 25.0\% |
| - | - | - | - |
| [Qwen 7B](https://huggingface.co/eleutherai/qwen-7b) | 22\% | 51.7\% | 11.6\% |
| - | - | - | - |
| [Cohere Command](https://cohere.com/) | 18\% | - | - |
| [GPT-3.5 Turbo](https://openai.com/) | 41\% | 57.1\% | 23.5\% |
| [GPT-4](https://openai.com/) | 68\% | 92.0\% | 42.5\% |
| [Claude 2](https://www.anthropic.com/) | 55\% | 88.0\% | - |
| [Grok-0 (33B)](https://x.ai/) | 37\% | 56.8\% | 15.7\% |
| [Grok-1](https://x.ai/) | 59\% | 62.9\% | 23.9\% |
## Observations
1. Plotting GSM8k performance versus performance on the exam, we can see clear evidence that several models overfit to the benchmark.

2. Despite [claims](https://huggingface.co/openchat/openchat_3.5#comparison-with-xai-grok-models) that OpenChat 3.5 is competitive with Grok-1, it only gets around half of Grok-1's score on the held-out exam, indicating that it simply overfits to evaluations.
3. Llemma 34B is competitive with GPT-3.5 Turbo on the held-out exam. Further instruction tuning of Llemma 34B should give even greater performance.
## Solutions
Please find model solutions and corresponding grades in the `solutions` folder.
## Running the Evaluation
To run the evaluation, run the following command:
```bash
python run_exam.py --model EleutherAI/llemma_34b --exam test/exam.csv --prompt few_shot
```
## Notes on Grading
There are a few problems which either require creating or reading a figure. For these problems, I graded the model solutions as incorrect. In the future when models have these abilities, this should be changed.
## Citation
To cite this article, use the following citation:
```bibtex
@misc{testing_language_models_on_a_held_out_high_school_national_finals_exam,
title={Testing Language Models on a Held-Out High School National Finals Exam},
author={Keiran Paster},
howpublished={\url{https://huggingface.co/datasets/keirp/hungarian_national_hs_finals_exam}},
journal = {HuggingFace repository},
year={2023},
}
```
|
VedCodes/easy_share_dataset | ---
task_categories:
- text-generation
language:
- en
tags:
- medical
pretty_name: tinyLily
size_categories:
- n<1K
--- |
heliosprime/twitter_dataset_1712921879 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 8873
num_examples: 20
download_size: 9892
dataset_size: 8873
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712921879"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EJaalborg2022/go-emotion-dk-autotranlated-10k | ---
dataset_info:
features:
- name: text_en
dtype: string
- name: text
dtype: string
- name: labels
dtype:
class_label:
names:
'0': admiration
'1': amusement
'2': anger
'3': annoyance
'4': approval
'5': caring
'6': confusion
'7': curiosity
'8': desire
'9': disappointment
'10': disapproval
'11': disgust
'12': embarrassment
'13': excitement
'14': fear
'15': gratitude
'16': grief
'17': joy
'18': love
'19': nervousness
'20': neutral
'21': optimism
'22': pride
'23': realization
'24': relief
'25': remorse
'26': sadness
'27': surprise
- name: __index_level_0__
dtype: int64
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 2912226
num_examples: 9000
- name: test
num_bytes: 163941
num_examples: 500
- name: valid
num_bytes: 161713
num_examples: 500
download_size: 1657402
dataset_size: 3237880
---
# Dataset Card for "go-emotion-dk-autotranlated-10k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ssbuild/alpaca_csl | ---
license: apache-2.0
---
|
Harshithacj123/new_data_model_methanol_100 | ---
dataset_info:
features:
- name: Train
dtype: string
splits:
- name: train
num_bytes: 94264
num_examples: 100
download_size: 41170
dataset_size: 94264
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_xriminact__TarsChattyBasev0.1 | ---
pretty_name: Evaluation run of xriminact/TarsChattyBasev0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xriminact/TarsChattyBasev0.1](https://huggingface.co/xriminact/TarsChattyBasev0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xriminact__TarsChattyBasev0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-18T13:28:46.282791](https://huggingface.co/datasets/open-llm-leaderboard/details_xriminact__TarsChattyBasev0.1/blob/main/results_2024-01-18T13-28-46.282791.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5543673608069786,\n\
\ \"acc_stderr\": 0.03400787921234687,\n \"acc_norm\": 0.5627560913076616,\n\
\ \"acc_norm_stderr\": 0.034799828523172316,\n \"mc1\": 0.2741738066095471,\n\
\ \"mc1_stderr\": 0.015616518497219371,\n \"mc2\": 0.41412077793772695,\n\
\ \"mc2_stderr\": 0.014661007860915117\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5563139931740614,\n \"acc_stderr\": 0.014518421825670444,\n\
\ \"acc_norm\": 0.5998293515358362,\n \"acc_norm_stderr\": 0.014317197787809174\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6289583748257319,\n\
\ \"acc_stderr\": 0.004820962855749738,\n \"acc_norm\": 0.8241386178052181,\n\
\ \"acc_norm_stderr\": 0.0037992414085029564\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.04046336883978251,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.04046336883978251\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5735849056603773,\n \"acc_stderr\": 0.030437794342983045,\n\
\ \"acc_norm\": 0.5735849056603773,\n \"acc_norm_stderr\": 0.030437794342983045\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.032650194750335815,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.032650194750335815\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939391,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939391\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159788,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159788\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7032258064516129,\n\
\ \"acc_stderr\": 0.025988500792411894,\n \"acc_norm\": 0.7032258064516129,\n\
\ \"acc_norm_stderr\": 0.025988500792411894\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.03793713171165635,\n\
\ \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.03793713171165635\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6414141414141414,\n \"acc_stderr\": 0.03416903640391521,\n \"\
acc_norm\": 0.6414141414141414,\n \"acc_norm_stderr\": 0.03416903640391521\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7564766839378239,\n \"acc_stderr\": 0.030975436386845426,\n\
\ \"acc_norm\": 0.7564766839378239,\n \"acc_norm_stderr\": 0.030975436386845426\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5820512820512821,\n \"acc_stderr\": 0.025007329882461213,\n\
\ \"acc_norm\": 0.5820512820512821,\n \"acc_norm_stderr\": 0.025007329882461213\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066482,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066482\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.031499305777849054,\n\
\ \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.031499305777849054\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7155963302752294,\n \"acc_stderr\": 0.019342036587702588,\n \"\
acc_norm\": 0.7155963302752294,\n \"acc_norm_stderr\": 0.019342036587702588\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7205882352941176,\n \"acc_stderr\": 0.031493281045079556,\n \"\
acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.031493281045079556\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6962025316455697,\n \"acc_stderr\": 0.029936696387138598,\n \
\ \"acc_norm\": 0.6962025316455697,\n \"acc_norm_stderr\": 0.029936696387138598\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.04345724570292534,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.04345724570292534\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6203703703703703,\n\
\ \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.6203703703703703,\n\
\ \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n\
\ \"acc_stderr\": 0.026655699653922744,\n \"acc_norm\": 0.7905982905982906,\n\
\ \"acc_norm_stderr\": 0.026655699653922744\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7381864623243933,\n\
\ \"acc_stderr\": 0.015720838678445266,\n \"acc_norm\": 0.7381864623243933,\n\
\ \"acc_norm_stderr\": 0.015720838678445266\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.02607431485165708,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.02607431485165708\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2581005586592179,\n\
\ \"acc_stderr\": 0.014635185616527827,\n \"acc_norm\": 0.2581005586592179,\n\
\ \"acc_norm_stderr\": 0.014635185616527827\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.02795604616542452,\n\
\ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.02795604616542452\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6270096463022508,\n\
\ \"acc_stderr\": 0.027466610213140112,\n \"acc_norm\": 0.6270096463022508,\n\
\ \"acc_norm_stderr\": 0.027466610213140112\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5771604938271605,\n \"acc_stderr\": 0.027487472980871588,\n\
\ \"acc_norm\": 0.5771604938271605,\n \"acc_norm_stderr\": 0.027487472980871588\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4078014184397163,\n \"acc_stderr\": 0.02931601177634356,\n \
\ \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.02931601177634356\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41134289439374183,\n\
\ \"acc_stderr\": 0.01256788267380368,\n \"acc_norm\": 0.41134289439374183,\n\
\ \"acc_norm_stderr\": 0.01256788267380368\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5772058823529411,\n \"acc_stderr\": 0.030008562845003483,\n\
\ \"acc_norm\": 0.5772058823529411,\n \"acc_norm_stderr\": 0.030008562845003483\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5555555555555556,\n \"acc_stderr\": 0.020102583895887184,\n \
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.020102583895887184\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.046313813194254656,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.046313813194254656\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.030862144921087555,\n\
\ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.030862144921087555\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7910447761194029,\n\
\ \"acc_stderr\": 0.028748298931728655,\n \"acc_norm\": 0.7910447761194029,\n\
\ \"acc_norm_stderr\": 0.028748298931728655\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2741738066095471,\n\
\ \"mc1_stderr\": 0.015616518497219371,\n \"mc2\": 0.41412077793772695,\n\
\ \"mc2_stderr\": 0.014661007860915117\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7584846093133386,\n \"acc_stderr\": 0.012028983782011874\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09401061410159212,\n \
\ \"acc_stderr\": 0.008038819818872452\n }\n}\n```"
repo_url: https://huggingface.co/xriminact/TarsChattyBasev0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|arc:challenge|25_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|gsm8k|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hellaswag|10_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T13-28-46.282791.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T13-28-46.282791.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- '**/details_harness|winogrande|5_2024-01-18T13-28-46.282791.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-18T13-28-46.282791.parquet'
- config_name: results
data_files:
- split: 2024_01_18T13_28_46.282791
path:
- results_2024-01-18T13-28-46.282791.parquet
- split: latest
path:
- results_2024-01-18T13-28-46.282791.parquet
---
# Dataset Card for Evaluation run of xriminact/TarsChattyBasev0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [xriminact/TarsChattyBasev0.1](https://huggingface.co/xriminact/TarsChattyBasev0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xriminact__TarsChattyBasev0.1",
"harness_winogrande_5",
split="train")
```
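The aggregated scores can be pulled the same way; a minimal sketch, assuming the "results" configuration and the "latest" split declared in the YAML header above are exposed through `load_dataset`:
```python
from datasets import load_dataset

# Aggregated metrics for this run; the "latest" split always points to the most recent results file.
results = load_dataset("open-llm-leaderboard/details_xriminact__TarsChattyBasev0.1",
	"results",
	split="latest")
```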
## Latest results
These are the [latest results from run 2024-01-18T13:28:46.282791](https://huggingface.co/datasets/open-llm-leaderboard/details_xriminact__TarsChattyBasev0.1/blob/main/results_2024-01-18T13-28-46.282791.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5543673608069786,
"acc_stderr": 0.03400787921234687,
"acc_norm": 0.5627560913076616,
"acc_norm_stderr": 0.034799828523172316,
"mc1": 0.2741738066095471,
"mc1_stderr": 0.015616518497219371,
"mc2": 0.41412077793772695,
"mc2_stderr": 0.014661007860915117
},
"harness|arc:challenge|25": {
"acc": 0.5563139931740614,
"acc_stderr": 0.014518421825670444,
"acc_norm": 0.5998293515358362,
"acc_norm_stderr": 0.014317197787809174
},
"harness|hellaswag|10": {
"acc": 0.6289583748257319,
"acc_stderr": 0.004820962855749738,
"acc_norm": 0.8241386178052181,
"acc_norm_stderr": 0.0037992414085029564
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.04046336883978251,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.04046336883978251
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5735849056603773,
"acc_stderr": 0.030437794342983045,
"acc_norm": 0.5735849056603773,
"acc_norm_stderr": 0.030437794342983045
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939391,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939391
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159788,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7032258064516129,
"acc_stderr": 0.025988500792411894,
"acc_norm": 0.7032258064516129,
"acc_norm_stderr": 0.025988500792411894
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.03793713171165635,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.03793713171165635
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6414141414141414,
"acc_stderr": 0.03416903640391521,
"acc_norm": 0.6414141414141414,
"acc_norm_stderr": 0.03416903640391521
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7564766839378239,
"acc_stderr": 0.030975436386845426,
"acc_norm": 0.7564766839378239,
"acc_norm_stderr": 0.030975436386845426
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5820512820512821,
"acc_stderr": 0.025007329882461213,
"acc_norm": 0.5820512820512821,
"acc_norm_stderr": 0.025007329882461213
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066482,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.031499305777849054,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.031499305777849054
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7155963302752294,
"acc_stderr": 0.019342036587702588,
"acc_norm": 0.7155963302752294,
"acc_norm_stderr": 0.019342036587702588
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7205882352941176,
"acc_stderr": 0.031493281045079556,
"acc_norm": 0.7205882352941176,
"acc_norm_stderr": 0.031493281045079556
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6962025316455697,
"acc_stderr": 0.029936696387138598,
"acc_norm": 0.6962025316455697,
"acc_norm_stderr": 0.029936696387138598
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.04345724570292534,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.04345724570292534
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7905982905982906,
"acc_stderr": 0.026655699653922744,
"acc_norm": 0.7905982905982906,
"acc_norm_stderr": 0.026655699653922744
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7381864623243933,
"acc_stderr": 0.015720838678445266,
"acc_norm": 0.7381864623243933,
"acc_norm_stderr": 0.015720838678445266
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.02607431485165708,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.02607431485165708
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2581005586592179,
"acc_stderr": 0.014635185616527827,
"acc_norm": 0.2581005586592179,
"acc_norm_stderr": 0.014635185616527827
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.02795604616542452,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.02795604616542452
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6270096463022508,
"acc_stderr": 0.027466610213140112,
"acc_norm": 0.6270096463022508,
"acc_norm_stderr": 0.027466610213140112
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5771604938271605,
"acc_stderr": 0.027487472980871588,
"acc_norm": 0.5771604938271605,
"acc_norm_stderr": 0.027487472980871588
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4078014184397163,
"acc_stderr": 0.02931601177634356,
"acc_norm": 0.4078014184397163,
"acc_norm_stderr": 0.02931601177634356
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41134289439374183,
"acc_stderr": 0.01256788267380368,
"acc_norm": 0.41134289439374183,
"acc_norm_stderr": 0.01256788267380368
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5772058823529411,
"acc_stderr": 0.030008562845003483,
"acc_norm": 0.5772058823529411,
"acc_norm_stderr": 0.030008562845003483
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.020102583895887184,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.020102583895887184
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.046313813194254656,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.046313813194254656
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.030862144921087555,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.030862144921087555
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7910447761194029,
"acc_stderr": 0.028748298931728655,
"acc_norm": 0.7910447761194029,
"acc_norm_stderr": 0.028748298931728655
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2741738066095471,
"mc1_stderr": 0.015616518497219371,
"mc2": 0.41412077793772695,
"mc2_stderr": 0.014661007860915117
},
"harness|winogrande|5": {
"acc": 0.7584846093133386,
"acc_stderr": 0.012028983782011874
},
"harness|gsm8k|5": {
"acc": 0.09401061410159212,
"acc_stderr": 0.008038819818872452
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_41 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1270326700.0
num_examples: 249475
download_size: 1291728996
dataset_size: 1270326700.0
---
# Dataset Card for "chunk_41"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jing24/sort_low_all_train | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: train
num_bytes: 79661311
num_examples: 87599
download_size: 46213808
dataset_size: 79661311
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "sort_low_all_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Morpheus777/xoda_project | ---
license: openrail
---
|
AngelBottomless/Konachan2023 | ---
license: mit
---
|
purnasai/SEC-10Q-10K-Coverpage | ---
license: cc-by-nc-4.0
task_categories:
- token-classification
language:
- en
tags:
- finance
size_categories:
- 1K<n<10K
---
|
jtatman/samantha_instruction_format | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: text
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 52003709
num_examples: 34687
download_size: 20270156
dataset_size: 52003709
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "samantha_instruction_format"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Echoshe/dataset1 | ---
license: mit
---
|
Shivani-3112/medicine-review | ---
license: apache-2.0
---
|
ndavidson/cisco_dcnm_finetuning_chatml | ---
dataset_info:
features:
- name: Question
dtype: string
- name: Text
dtype: string
- name: Answer
dtype: string
- name: prompt_and_answer
dtype: string
splits:
- name: train
num_bytes: 20877657
num_examples: 2506
download_size: 5705767
dataset_size: 20877657
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
PhilKey/llama2-openrewrite-docs-chat | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 91751
num_examples: 95
download_size: 26661
dataset_size: 91751
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ShoukanLabs/LAION-DallE-3-Local | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: url
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 1531813332.75
num_examples: 1250
download_size: 1176337783
dataset_size: 1531813332.75
---
# Dataset Card for "LAION-DallE-3-Local"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test_cot_v1-math-1bbcaf-1917164992 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test_cot_v1
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-13b_eval
metrics: []
dataset_name: mathemakitten/winobias_antistereotype_test_cot_v1
dataset_config: mathemakitten--winobias_antistereotype_test_cot_v1
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-13b_eval
* Dataset: mathemakitten/winobias_antistereotype_test_cot_v1
* Config: mathemakitten--winobias_antistereotype_test_cot_v1
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@mathemakitten](https://huggingface.co/mathemakitten) for evaluating this model. |
markbotterill/imdb_embeddings | ---
license: openrail
---
|
jxm/sst2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: dev
path: data/dev-*
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 4412207
num_examples: 67349
- name: test
num_bytes: 209356
num_examples: 1821
- name: dev
num_bytes: 29783
num_examples: 256
download_size: 2898575
dataset_size: 4651346
---
# Dataset Card for "sst2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_TeeZee__Xwin-LM-70B-V0.1_Jannie | ---
pretty_name: Evaluation run of TeeZee/Xwin-LM-70B-V0.1_Jannie
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TeeZee/Xwin-LM-70B-V0.1_Jannie](https://huggingface.co/TeeZee/Xwin-LM-70B-V0.1_Jannie)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TeeZee__Xwin-LM-70B-V0.1_Jannie\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-25T10:59:59.024492](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__Xwin-LM-70B-V0.1_Jannie/blob/main/results_2024-01-25T10-59-59.024492.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6920145646239103,\n\
\ \"acc_stderr\": 0.03050636265571641,\n \"acc_norm\": 0.6988351160849339,\n\
\ \"acc_norm_stderr\": 0.031092908305197647,\n \"mc1\": 0.40514075887392903,\n\
\ \"mc1_stderr\": 0.017185611727753368,\n \"mc2\": 0.6014420746258905,\n\
\ \"mc2_stderr\": 0.015084429602270856\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6629692832764505,\n \"acc_stderr\": 0.013813476652902276,\n\
\ \"acc_norm\": 0.71160409556314,\n \"acc_norm_stderr\": 0.013238394422428178\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6803425612427804,\n\
\ \"acc_stderr\": 0.004653907471785645,\n \"acc_norm\": 0.8685520812587134,\n\
\ \"acc_norm_stderr\": 0.0033719902188524583\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742399,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8157894736842105,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.8157894736842105,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8263888888888888,\n\
\ \"acc_stderr\": 0.03167473383795718,\n \"acc_norm\": 0.8263888888888888,\n\
\ \"acc_norm_stderr\": 0.03167473383795718\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6851063829787234,\n \"acc_stderr\": 0.030363582197238167,\n\
\ \"acc_norm\": 0.6851063829787234,\n \"acc_norm_stderr\": 0.030363582197238167\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.046570472605949625,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.046570472605949625\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4497354497354497,\n \"acc_stderr\": 0.025620857042936655,\n \"\
acc_norm\": 0.4497354497354497,\n \"acc_norm_stderr\": 0.025620857042936655\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n\
\ \"acc_stderr\": 0.021886178567172527,\n \"acc_norm\": 0.8193548387096774,\n\
\ \"acc_norm_stderr\": 0.021886178567172527\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.03499113137676744,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.03499113137676744\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\"\
: 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n\
\ \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8888888888888888,\n \"acc_stderr\": 0.02239078763821677,\n \"\
acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.02239078763821677\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.01673108529360755,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.01673108529360755\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7205128205128205,\n \"acc_stderr\": 0.022752388839776826,\n\
\ \"acc_norm\": 0.7205128205128205,\n \"acc_norm_stderr\": 0.022752388839776826\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083015,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083015\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7478991596638656,\n \"acc_stderr\": 0.028205545033277723,\n\
\ \"acc_norm\": 0.7478991596638656,\n \"acc_norm_stderr\": 0.028205545033277723\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.46357615894039733,\n \"acc_stderr\": 0.04071636065944215,\n \"\
acc_norm\": 0.46357615894039733,\n \"acc_norm_stderr\": 0.04071636065944215\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8825688073394495,\n \"acc_stderr\": 0.013802780227377352,\n \"\
acc_norm\": 0.8825688073394495,\n \"acc_norm_stderr\": 0.013802780227377352\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5462962962962963,\n \"acc_stderr\": 0.033953227263757976,\n \"\
acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.033953227263757976\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"\
acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640266,\n \
\ \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640266\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7892376681614349,\n\
\ \"acc_stderr\": 0.02737309550054019,\n \"acc_norm\": 0.7892376681614349,\n\
\ \"acc_norm_stderr\": 0.02737309550054019\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8473282442748091,\n \"acc_stderr\": 0.031545216720054725,\n\
\ \"acc_norm\": 0.8473282442748091,\n \"acc_norm_stderr\": 0.031545216720054725\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8512396694214877,\n \"acc_stderr\": 0.032484700838071943,\n \"\
acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.032484700838071943\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.03044677768797173,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.03044677768797173\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822582,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822582\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8659003831417624,\n\
\ \"acc_stderr\": 0.012185528166499978,\n \"acc_norm\": 0.8659003831417624,\n\
\ \"acc_norm_stderr\": 0.012185528166499978\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7861271676300579,\n \"acc_stderr\": 0.022075709251757177,\n\
\ \"acc_norm\": 0.7861271676300579,\n \"acc_norm_stderr\": 0.022075709251757177\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5865921787709497,\n\
\ \"acc_stderr\": 0.016469814928406164,\n \"acc_norm\": 0.5865921787709497,\n\
\ \"acc_norm_stderr\": 0.016469814928406164\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.024954184324879905,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.024954184324879905\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7684887459807074,\n\
\ \"acc_stderr\": 0.023956532766639133,\n \"acc_norm\": 0.7684887459807074,\n\
\ \"acc_norm_stderr\": 0.023956532766639133\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8271604938271605,\n \"acc_stderr\": 0.021038517770157375,\n\
\ \"acc_norm\": 0.8271604938271605,\n \"acc_norm_stderr\": 0.021038517770157375\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5212765957446809,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.5212765957446809,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.546284224250326,\n\
\ \"acc_stderr\": 0.012715404841277752,\n \"acc_norm\": 0.546284224250326,\n\
\ \"acc_norm_stderr\": 0.012715404841277752\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7132352941176471,\n \"acc_stderr\": 0.027472274473233818,\n\
\ \"acc_norm\": 0.7132352941176471,\n \"acc_norm_stderr\": 0.027472274473233818\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7565359477124183,\n \"acc_stderr\": 0.017362473762146616,\n \
\ \"acc_norm\": 0.7565359477124183,\n \"acc_norm_stderr\": 0.017362473762146616\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02560737598657916,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02560737598657916\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n\
\ \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n\
\ \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \
\ \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.0261682213446623,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.0261682213446623\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40514075887392903,\n\
\ \"mc1_stderr\": 0.017185611727753368,\n \"mc2\": 0.6014420746258905,\n\
\ \"mc2_stderr\": 0.015084429602270856\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8105761641673244,\n \"acc_stderr\": 0.011012790432989248\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.40788476118271416,\n \
\ \"acc_stderr\": 0.013536742075643085\n }\n}\n```"
repo_url: https://huggingface.co/TeeZee/Xwin-LM-70B-V0.1_Jannie
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|arc:challenge|25_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|gsm8k|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hellaswag|10_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T10-59-59.024492.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T10-59-59.024492.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- '**/details_harness|winogrande|5_2024-01-25T10-59-59.024492.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-25T10-59-59.024492.parquet'
- config_name: results
data_files:
- split: 2024_01_25T10_59_59.024492
path:
- results_2024-01-25T10-59-59.024492.parquet
- split: latest
path:
- results_2024-01-25T10-59-59.024492.parquet
---
# Dataset Card for Evaluation run of TeeZee/Xwin-LM-70B-V0.1_Jannie
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TeeZee/Xwin-LM-70B-V0.1_Jannie](https://huggingface.co/TeeZee/Xwin-LM-70B-V0.1_Jannie) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TeeZee__Xwin-LM-70B-V0.1_Jannie",
"harness_winogrande_5",
split="train")
```
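The aggregated metrics can be loaded the same way from the "results" configuration; for example (the configuration and split names follow the listing above):
```python
from datasets import load_dataset

# "results" holds the aggregated metrics; the "latest" split points to the most recent run.
results = load_dataset("open-llm-leaderboard/details_TeeZee__Xwin-LM-70B-V0.1_Jannie",
	"results",
	split="latest")
```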
## Latest results
These are the [latest results from run 2024-01-25T10:59:59.024492](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__Xwin-LM-70B-V0.1_Jannie/blob/main/results_2024-01-25T10-59-59.024492.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6920145646239103,
"acc_stderr": 0.03050636265571641,
"acc_norm": 0.6988351160849339,
"acc_norm_stderr": 0.031092908305197647,
"mc1": 0.40514075887392903,
"mc1_stderr": 0.017185611727753368,
"mc2": 0.6014420746258905,
"mc2_stderr": 0.015084429602270856
},
"harness|arc:challenge|25": {
"acc": 0.6629692832764505,
"acc_stderr": 0.013813476652902276,
"acc_norm": 0.71160409556314,
"acc_norm_stderr": 0.013238394422428178
},
"harness|hellaswag|10": {
"acc": 0.6803425612427804,
"acc_stderr": 0.004653907471785645,
"acc_norm": 0.8685520812587134,
"acc_norm_stderr": 0.0033719902188524583
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742399,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8157894736842105,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.8157894736842105,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8263888888888888,
"acc_stderr": 0.03167473383795718,
"acc_norm": 0.8263888888888888,
"acc_norm_stderr": 0.03167473383795718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6851063829787234,
"acc_stderr": 0.030363582197238167,
"acc_norm": 0.6851063829787234,
"acc_norm_stderr": 0.030363582197238167
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.046570472605949625,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.046570472605949625
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4497354497354497,
"acc_stderr": 0.025620857042936655,
"acc_norm": 0.4497354497354497,
"acc_norm_stderr": 0.025620857042936655
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.021886178567172527,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.021886178567172527
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.02239078763821677,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.02239078763821677
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.01673108529360755,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.01673108529360755
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7205128205128205,
"acc_stderr": 0.022752388839776826,
"acc_norm": 0.7205128205128205,
"acc_norm_stderr": 0.022752388839776826
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083015,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083015
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7478991596638656,
"acc_stderr": 0.028205545033277723,
"acc_norm": 0.7478991596638656,
"acc_norm_stderr": 0.028205545033277723
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.46357615894039733,
"acc_stderr": 0.04071636065944215,
"acc_norm": 0.46357615894039733,
"acc_norm_stderr": 0.04071636065944215
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8825688073394495,
"acc_stderr": 0.013802780227377352,
"acc_norm": 0.8825688073394495,
"acc_norm_stderr": 0.013802780227377352
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658928,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658928
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.019269323025640266,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.019269323025640266
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7892376681614349,
"acc_stderr": 0.02737309550054019,
"acc_norm": 0.7892376681614349,
"acc_norm_stderr": 0.02737309550054019
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8473282442748091,
"acc_stderr": 0.031545216720054725,
"acc_norm": 0.8473282442748091,
"acc_norm_stderr": 0.031545216720054725
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.032484700838071943,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.032484700838071943
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.03044677768797173,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.03044677768797173
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822582,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822582
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8659003831417624,
"acc_stderr": 0.012185528166499978,
"acc_norm": 0.8659003831417624,
"acc_norm_stderr": 0.012185528166499978
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7861271676300579,
"acc_stderr": 0.022075709251757177,
"acc_norm": 0.7861271676300579,
"acc_norm_stderr": 0.022075709251757177
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5865921787709497,
"acc_stderr": 0.016469814928406164,
"acc_norm": 0.5865921787709497,
"acc_norm_stderr": 0.016469814928406164
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.024954184324879905,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.024954184324879905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7684887459807074,
"acc_stderr": 0.023956532766639133,
"acc_norm": 0.7684887459807074,
"acc_norm_stderr": 0.023956532766639133
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8271604938271605,
"acc_stderr": 0.021038517770157375,
"acc_norm": 0.8271604938271605,
"acc_norm_stderr": 0.021038517770157375
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5212765957446809,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.5212765957446809,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.546284224250326,
"acc_stderr": 0.012715404841277752,
"acc_norm": 0.546284224250326,
"acc_norm_stderr": 0.012715404841277752
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7132352941176471,
"acc_stderr": 0.027472274473233818,
"acc_norm": 0.7132352941176471,
"acc_norm_stderr": 0.027472274473233818
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7565359477124183,
"acc_stderr": 0.017362473762146616,
"acc_norm": 0.7565359477124183,
"acc_norm_stderr": 0.017362473762146616
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8,
"acc_stderr": 0.02560737598657916,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02560737598657916
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.0261682213446623,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.0261682213446623
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40514075887392903,
"mc1_stderr": 0.017185611727753368,
"mc2": 0.6014420746258905,
"mc2_stderr": 0.015084429602270856
},
"harness|winogrande|5": {
"acc": 0.8105761641673244,
"acc_stderr": 0.011012790432989248
},
"harness|gsm8k|5": {
"acc": 0.40788476118271416,
"acc_stderr": 0.013536742075643085
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
seokheeyam/funsd-bank-paragraph-test3 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: bboxes
sequence:
sequence: int64
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-HEADER
'2': I-HEADER
'3': B-QUESTION
'4': I-QUESTION
'5': B-ANSWER
'6': I-ANSWER
- name: image
dtype: image
- name: bboxes_block
sequence:
sequence: int64
- name: texts
sequence: string
- name: labels
sequence:
class_label:
names:
'0': other
'1': header
'2': question
'3': answer
splits:
- name: train
num_bytes: 1010809462.0
num_examples: 125
- name: test
num_bytes: 137515204.0
num_examples: 15
download_size: 1145069495
dataset_size: 1148324666.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
joey234/mmlu-moral_disputes-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
- name: neg_prompt
dtype: string
- name: fewshot_context_neg
dtype: string
- name: fewshot_context_ori
dtype: string
splits:
- name: dev
num_bytes: 8440
num_examples: 5
- name: test
num_bytes: 3543294
num_examples: 346
download_size: 310275
dataset_size: 3551734
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-moral_disputes-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_rte_perfect_already | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 249961
num_examples: 575
- name: train
num_bytes: 214547
num_examples: 452
download_size: 302174
dataset_size: 464508
---
# Dataset Card for "MULTI_VALUE_rte_perfect_already"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/fine_art_photography_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 132266
num_examples: 1000
download_size: 5939
dataset_size: 132266
---
# Dataset Card for "fine_art_photography_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-103000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1023678
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pcostarotela/dataset-test-p1 | ---
license: mit
---
|
partitionsofunity/DataComp-1B | ---
license: cc-by-nc-nd-4.0
---
|
Azazelle/Genderswap | ---
license: openrail
---
|
ericyu/SYSU_CD | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: val
path: data/val-*
dataset_info:
features:
- name: imageA
dtype: image
- name: imageB
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 3393267984.0
num_examples: 12000
- name: test
num_bytes: 1196988392.0
num_examples: 4000
- name: val
num_bytes: 1164865940.0
num_examples: 4000
download_size: 5814133284
dataset_size: 5755122316.0
---
# Dataset Card for "SYSU_CD"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LambdaTests/VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_18_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 941
num_examples: 32
download_size: 2039
dataset_size: 941
---
# Dataset Card for "VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_18_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Gabriel1322/datasetlira | ---
license: openrail
---
|
daytoy-models/coronary-artery | ---
pipeline_tag: text-generation
language:
- 'no'
- aa
inference: true
widget:
- text: 'def print_hello_world():'
example_title: Hello world
group: Python
license: bigcode-openrail-m
datasets:
- bigcode/the-stack-dedup
metrics:
- code_eval
library_name: transformers
tags:
- code
model-index:
- name: StarCoder
results:
- task:
type: text-generation
dataset:
type: openai_humaneval
name: HumanEval (Prompted)
metrics:
- name: pass@1
type: pass@1
value: 0.408
verified: false
- task:
type: text-generation
dataset:
type: openai_humaneval
name: HumanEval
metrics:
- name: pass@1
type: pass@1
value: 0.336
verified: false
- task:
type: text-generation
dataset:
type: mbpp
name: MBPP
metrics:
- name: pass@1
type: pass@1
value: 0.527
verified: false
- task:
type: text-generation
dataset:
type: ds1000
name: DS-1000 (Overall Completion)
metrics:
- name: pass@1
type: pass@1
value: 0.26
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (C++)
metrics:
- name: pass@1
type: pass@1
value: 0.3155
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (C#)
metrics:
- name: pass@1
type: pass@1
value: 0.2101
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (D)
metrics:
- name: pass@1
type: pass@1
value: 0.1357
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Go)
metrics:
- name: pass@1
type: pass@1
value: 0.1761
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Java)
metrics:
- name: pass@1
type: pass@1
value: 0.3022
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Julia)
metrics:
- name: pass@1
type: pass@1
value: 0.2302
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (JavaScript)
metrics:
- name: pass@1
type: pass@1
value: 0.3079
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Lua)
metrics:
- name: pass@1
type: pass@1
value: 0.2389
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (PHP)
metrics:
- name: pass@1
type: pass@1
value: 0.2608
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Perl)
metrics:
- name: pass@1
type: pass@1
value: 0.1734
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Python)
metrics:
- name: pass@1
type: pass@1
value: 0.3357
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (R)
metrics:
- name: pass@1
type: pass@1
value: 0.155
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Ruby)
metrics:
- name: pass@1
type: pass@1
value: 0.0124
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Racket)
metrics:
- name: pass@1
type: pass@1
value: 0.0007
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Rust)
metrics:
- name: pass@1
type: pass@1
value: 0.2184
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Scala)
metrics:
- name: pass@1
type: pass@1
value: 0.2761
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Bash)
metrics:
- name: pass@1
type: pass@1
value: 0.1046
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Swift)
metrics:
- name: pass@1
type: pass@1
value: 0.2274
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (TypeScript)
metrics:
- name: pass@1
type: pass@1
value: 0.3229
verified: false
extra_gated_prompt: >-
## Model License Agreement Please read the BigCode [OpenRAIL-M
license](https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement)
agreement before accepting it.
extra_gated_fields:
I accept the above license agreement, and will use the Model complying with the set of use restrictions and sharing requirements: checkbox
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
BAAI/COIG-PC-core | ---
extra_gated_heading: "Acknowledge license to accept the repository"
extra_gated_prompt: |
  The Beijing Academy of Artificial Intelligence (hereinafter "we" or the "Academy") provides open-source datasets (hereinafter the "Dataset") to you through BAAI DataHub (data.baai.ac.cn) and the COIG-PC HuggingFace repository (https://huggingface.co/datasets/BAAI/COIG-PC). You may obtain the open-source datasets you need by downloading them and, provided you comply with the usage rules of each original dataset, use them for purposes such as learning, research, and commercial applications.
  Before obtaining the open-source datasets (including but not limited to accessing, downloading, copying, distributing, using, or otherwise processing them), you should carefully read and understand this "COIG-PC Open-Source Dataset Usage Notice and Disclaimer" (hereinafter this "Notice"). Once you obtain the open-source datasets, regardless of the means of acquisition, your act of obtaining them will be deemed acceptance of the entire content of this Notice.
  1. Ownership and Operation of the Platform
  You should fully understand and acknowledge that the ownership and operation rights of BAAI DataHub and the COIG-PC HuggingFace repository (including the current version and all historical versions) belong to the Beijing Academy of Artificial Intelligence, and that the Academy holds the final right of interpretation and decision regarding this platform/tool and the open-source dataset release program.
  You acknowledge and understand that, due to updates and improvements of relevant laws and regulations and objective changes in our legal compliance obligations, we reserve the right to update and maintain this platform/tool from time to time, or to suspend or even permanently terminate its services. We will notify you of such circumstances within a reasonable time by reasonable means such as announcements or emails, and you should make corresponding adjustments and arrangements in a timely manner; however, we shall not be liable for any loss caused to you by any of the foregoing circumstances.
  2. Rights Claims Regarding the Open-Source Datasets
  To facilitate your activities of obtaining and using the datasets for learning, research, and commercial purposes, we have carried out necessary processing of the third-party original datasets, such as format integration, data cleaning, labeling, classification, and annotation, to form the open-source datasets available to users of this platform/tool.
  You acknowledge and understand that we do not claim the property rights within intellectual property over the open-source datasets, and accordingly we have no obligation to proactively identify and protect intellectual property that may exist in them; however, this does not mean that we waive personal rights such as the right of attribution, the right of publication, the right of modification, and the right of integrity of the work (if any) with respect to the open-source datasets. Intellectual property and corresponding legitimate rights and interests that may exist in the original datasets remain with the original rights holders.
  Furthermore, making available to you the open-source datasets that have been reasonably arranged, processed, and handled does not imply our endorsement of the authenticity, accuracy, or non-controversial nature of the intellectual property or information content of the original datasets. You should screen and carefully verify the open-source datasets you choose to use. You acknowledge and agree that the Academy makes no commitment and bears no warranty liability that the original datasets you choose to use are free of defects or flaws.
  3. Restrictions on Use of the Open-Source Datasets
  Your use of the datasets must not infringe upon the legitimate rights and interests of us or any third party (including but not limited to copyright, patent rights, trademark rights, and other intellectual property and other rights and interests).
  After obtaining the open-source datasets, you should ensure that your use does not exceed the usage rules explicitly specified by the rights holders of the original datasets in the form of public announcements or agreements, including the scope of use, purposes, and lawful uses of the original data. We kindly remind you that if your use of the open-source datasets exceeds the originally intended scope and purposes of the original datasets, you may face the risk of infringing the legitimate rights and interests (for example, intellectual property) of the original datasets' rights holders and may bear corresponding legal liability.
  4. Personal Information Protection
  Due to objective reasons such as technical limitations and the public-interest nature of the open-source datasets, we cannot guarantee that the open-source datasets contain no personal information, and we assume no legal liability for any personal information that may be involved in them.
  If the open-source datasets involve personal information, we assume no legal liability for any personal information processing activities that may be involved in your use of them. We kindly remind you that you should process personal information in accordance with the Personal Information Protection Law and other relevant laws and regulations.
  In order to safeguard the legitimate rights and interests of information subjects and to comply with applicable laws and administrative regulations, if you discover content involving or possibly involving personal information while using the open-source datasets, you should immediately stop using the portions of the datasets involving personal information and promptly contact us through the channels specified in "6. Complaints and Notifications".
  5. Information Content Management
  We assume no legal liability for any illegal or harmful information that may be involved in the open-source datasets.
  If you discover that the open-source datasets involve or may involve any illegal or harmful information while using them, you should immediately stop using the portions of the datasets involving such information and promptly contact us through the channels specified in "6. Complaints and Notifications".
  6. Complaints and Notifications
  If you believe that the open-source datasets infringe your legitimate rights and interests, you may contact us at 010-50955974, and we will handle your claims and complaints in a timely manner in accordance with the law.
  In order to handle your claims and complaints, we may need you to provide contact information, proof of infringement, identification documents, and other materials. Please note that if you file complaints maliciously or make false statements, you will bear all legal liability arising therefrom (including but not limited to compensation for reasonable expenses).
  7. Disclaimer
  You understand and agree that, given the nature of open-source datasets, the datasets may contain data from different sources and contributors whose authenticity, accuracy, and objectivity may vary, and we cannot make any commitment regarding the availability or reliability of any dataset.
  Under no circumstances shall we bear any legal liability for any risks that may exist in the open-source datasets, such as infringement of personal information, dissemination of illegal or harmful information, or infringement of intellectual property.
  Under no circumstances shall we bear any legal liability for any losses you suffer arising from or in connection with the open-source datasets (including but not limited to direct losses, indirect losses, and loss of anticipated profits).
  8. Miscellaneous
  The open-source datasets are in a stage of continuous development and change. We may update or adjust the scope of the open-source datasets provided, or suspend, pause, or terminate the open-source dataset provision service, for reasons such as business development, third-party cooperation, or changes in laws and regulations.
extra_gated_fields:
Name: text
Affiliation: text
Country: text
I agree to use this model for non-commercial use ONLY: checkbox
extra_gated_button_content: "Acknowledge license"
license: unknown
language:
- zh
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: task_type
struct:
- name: major
sequence: string
- name: minor
sequence: string
- name: domain
sequence: string
- name: other
dtype: string
- name: task_name_in_eng
dtype: string
- name: index
dtype: string
splits:
- name: train
num_bytes: 1053129000
num_examples: 744592
download_size: 416315627
dataset_size: 1053129000
---
# COIG Prompt Collection
## License
**Default Licensing for Sub-Datasets Without Specific License Declaration**: In instances where sub-datasets within the COIG-PC Dataset do not have a specific license declaration, the Apache License 2.0 (Apache-2.0) applies by default.
**Precedence of Declared Licensing for Sub-Datasets**: For any sub-dataset within the COIG-PC Dataset that has an explicitly declared license, the terms and conditions of the declared license shall take precedence and govern the usage of that particular sub-dataset.
Users and developers utilizing the COIG-PC Dataset must ensure compliance with the licensing terms as outlined above. It is imperative to review and adhere to the specified licensing conditions of each sub-dataset, as they may vary.
## What is COIG-PC?
The COIG-PC Dataset is a meticulously curated and comprehensive collection of Chinese tasks and data, designed to facilitate the fine-tuning and optimization of language models for Chinese natural language processing (NLP). The dataset aims to provide researchers and developers with a rich set of resources to improve the capabilities of language models in handling Chinese text, which can be utilized in various fields such as text generation, information extraction, sentiment analysis, machine translation, among others.
If the full COIG-PC is too large for your needs, please refer to [COIG-PC-Lite](https://huggingface.co/datasets/BAAI/COIG-PC-Lite), a subset of COIG-PC containing only 200 samples from each task file.
## Why COIG-PC?
The COIG-PC Dataset is an invaluable resource for the domain of natural language processing (NLP) for various compelling reasons:
**Addressing Language Complexity**: Chinese is known for its intricacy, with a vast array of characters and diverse grammatical structures. A specialized dataset like COIG-PC, which is tailored for the Chinese language, is essential to adequately address these complexities during model training.
**Comprehensive Data Aggregation**: The COIG-PC Dataset is a result of an extensive effort in integrating almost all available Chinese datasets in the market. This comprehensive aggregation makes it one of the most exhaustive collections for Chinese NLP.
**Data Deduplication and Normalization**: The COIG-PC Dataset underwent rigorous manual processing to eliminate duplicate data and perform normalization. This ensures that the dataset is free from redundancy, and the data is consistent and well-structured, making it more user-friendly and efficient for model training.
**Fine-tuning and Optimization**: The dataset’s instruction-based phrasing facilitates better fine-tuning and optimization of language models. This structure allows models to better understand and execute tasks, which is particularly beneficial in improving performance on unseen or novel tasks.
The COIG-PC Dataset, with its comprehensive aggregation, meticulous selection, deduplication, and normalization of data, stands as an unmatched resource for training and optimizing language models tailored for the Chinese language and culture. It addresses the unique challenges of Chinese language processing and serves as a catalyst for advancements in Chinese NLP.
## Who builds COIG-PC?
COIG-PC builds on a foundation dataset furnished by stardust.ai, which aggregates data collected from the Internet.
COIG-PC is also the result of a collaborative effort involving engineers and experts from over twenty distinguished universities in China and abroad. Due to space constraints, it is not feasible to list all of them; the following are a few notable institutions among the collaborators:
- Beijing Academy of Artificial Intelligence, China
<img src="https://huggingface.co/datasets/BAAI/COIG-PC-core/resolve/main/assets/baai.png" alt= “BAAI” height="100" width="150">
- Peking University, China
<img src="https://huggingface.co/datasets/BAAI/COIG-PC-core/resolve/main/assets/pku.png" alt= “PKU” height="100" width="200">
- The Hong Kong University of Science and Technology (HKUST), China
<img src="https://huggingface.co/datasets/BAAI/COIG-PC-core/resolve/main/assets/hkust.png" alt= “HKUST” height="100" width="200">
- The University of Waterloo, Canada
<img src="https://huggingface.co/datasets/BAAI/COIG-PC-core/resolve/main/assets/waterloo.png" alt= “Waterloo” height="100" width="150">
- The University of Sheffield, United Kingdom
<img src="https://huggingface.co/datasets/BAAI/COIG-PC-core/resolve/main/assets/sheffield.png" alt= “Sheffield” height="100" width="200">
- Beijing University of Posts and Telecommunications, China
<img src="https://huggingface.co/datasets/BAAI/COIG-PC-core/resolve/main/assets/bupt.png" alt= “BUPT” height="100" width="200">
- [Multimodal Art Projection](https://huggingface.co/m-a-p)
<img src="https://huggingface.co/datasets/BAAI/COIG-PC-core/resolve/main/assets/map.png" alt= “M.A.P” height="100" width="200">
- stardust.ai, China
<img src="https://huggingface.co/datasets/BAAI/COIG-PC-core/resolve/main/assets/stardust.png" alt= “stardust.ai” height="100" width="200">
- LinkSoul.AI, China
<img src="https://huggingface.co/datasets/BAAI/COIG-PC-core/resolve/main/assets/linksoul.png" alt= “linksoul.ai” height="100" width="200">
For the detailed list of engineers involved in the creation and refinement of COIG-PC, please refer to the paper that will be published subsequently. This paper will provide in-depth information regarding the contributions and the specifics of the dataset’s development process.
## How to use COIG-PC?
COIG-PC is structured in a **.jsonl** file format. Each line in the file represents a single data record and is structured in JSON (JavaScript Object Notation) format. Below is a breakdown of the elements within each line:
**instruction**: This is a text string that provides the instruction for the task. For example, it might tell the model what to do with the input data.
**input**: This is the input data that the model needs to process. In the context of translation, it would be the text that needs to be translated.
**output**: This contains the expected output data after processing the input. In the context of translation, it would be the translated text.
**split**: Indicates the official split of the original dataset, which is used to categorize data for different phases of model training and evaluation. It can be 'train', 'test', 'valid', etc.
**task_type**: Contains major and minor categories for the dataset. Major categories are broader, while minor categories can be more specific subcategories.
**domain**: Indicates the domain or field to which the data belongs.
**other**: This field can contain additional information or metadata regarding the data record. If there is no additional information, it may be set to null.
### Example
Here is an example of how a line in the COIG-PC dataset might be structured:
```json
{
"instruction": "请把下面的中文句子翻译成英文",
"input": "我爱你。",
"output": "I love you.",
"split": "train",
"task_type": {
"major": ["翻译"],
"minor": ["翻译", "中译英"]
},
"domain": ["通用"],
"other": null
}
```
In this example:
**instruction** tells the model to translate the following Chinese sentence into English.
**input** contains the Chinese text "我爱你" which means "I love you".
**output** contains the expected translation in English: "I love you".
**split** indicates that this data record is part of the training set.
**task_type** specifies that the major category is "Translation" and the minor categories are "Translation" and "Chinese to English".
**domain** specifies that this data record belongs to the general domain.
**other** is set to null as there is no additional information for this data record.
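To inspect records like the one above programmatically, one option is the `datasets` library. The following is a minimal sketch rather than an official loader: the repository id and the default `train` split follow the metadata at the top of this card, and because the dataset is gated you may first need to accept the license and authenticate (e.g. via `huggingface-cli login`):
```python
from datasets import load_dataset

# Default configuration, train split (see the dataset_info metadata above).
dataset = load_dataset("BAAI/COIG-PC-core", split="train")

record = dataset[0]
print(record["instruction"])   # task instruction
print(record["input"])         # input text to process
print(record["output"])        # expected output
print(record["task_type"]["major"], record["task_type"]["minor"])
print(record["domain"])        # domain tags, e.g. ["通用"]
```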
## Update: Aug. 30, 2023
- v1.0: First version of COIG-PC-core.
## COIG-PC Citation
If you want to cite the COIG-PC-core dataset, you can use the following:
```
```
## Contact Us
To contact us, feel free to create an Issue in this repository.
|
AI-C/codeformer-clone | ---
title: CodeFormer
emoji: 🐼
colorFrom: blue
colorTo: green
sdk: gradio
sdk_version: 3.37.0
app_file: app.py
pinned: false
---
Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
|
Duckets/DataSet | ---
license: cc
---
|
liuyanchen1015/VALUE_qnli_null_genetive | ---
dataset_info:
features:
- name: question
dtype: string
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 354308
num_examples: 1268
- name: test
num_bytes: 371796
num_examples: 1337
- name: train
num_bytes: 6138826
num_examples: 22557
download_size: 4346027
dataset_size: 6864930
---
# Dataset Card for "VALUE_qnli_null_genetive"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Gitbart/Polish_law | ---
license: other
task_categories:
- question-answering
language:
- pl
tags:
- legal
size_categories:
- n<1K
--- |
arieg/bw_spec_cls_4_22_noise_200 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '1706'
'1': '1720'
'2': '1732'
'3': '1733'
splits:
- name: train
num_bytes: 43598869.0
num_examples: 800
- name: test
num_bytes: 1089488.0
num_examples: 20
download_size: 23887029
dataset_size: 44688357.0
---
# Dataset Card for "bw_spec_cls_4_22_noise_200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vincha77/english_to_pirate | ---
dataset_info:
features:
- name: english
dtype: string
- name: pirate
dtype: string
splits:
- name: train
num_bytes: 533.6
num_examples: 8
- name: test
num_bytes: 66.7
num_examples: 1
- name: valid
num_bytes: 66.7
num_examples: 1
download_size: 5162
dataset_size: 667.0000000000001
---
# Dataset Card for "english_to_pirate"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adilhabibi/bioacoustic_mel_segments | ---
dataset_info:
features:
- name: segments
sequence:
sequence:
sequence: float32
- name: label_idices
dtype: int64
- name: label_names
dtype: string
splits:
- name: train
num_bytes: 72803953
num_examples: 1457
download_size: 53309954
dataset_size: 72803953
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "bioacoustic_mel_segments"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KentoTsu/ventilador | ---
license: openrail
---
|
oofnan/stegllm | ---
license: mit
---
|
namanyash/client-1-preprocessed-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 7905847
num_examples: 6000
download_size: 4344166
dataset_size: 7905847
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-staging-eval-project-ac4402f5-7985078 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- beans
eval_info:
task: image_multi_class_classification
model: nateraw/vit-base-beans-demo-v3
metrics: []
dataset_name: beans
dataset_config: default
dataset_split: test
col_mapping:
image: image
target: labels
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Image Classification
* Model: nateraw/vit-base-beans-demo-v3
* Dataset: beans
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
Ti-Ma/wikipedia_2019 | ---
license: cc-by-sa-3.0
---
|
AcasoQ/coachvoz | ---
license: openrail
---
|
spmml/fml-bankc | ---
license: unknown
---
|
cj-mills/labelme-instance-segmentation-toy-dataset | ---
license: mit
---
|
UNCANNY69/few_shot | ---
license: mit
---
|
jimzhiwei/amazon_product | ---
license: openrail
---
|
jinaai/hotpotqa-reranking-en | ---
dataset_info:
features:
- name: query
dtype: string
- name: positive
sequence: string
- name: negative
sequence: string
splits:
- name: test
num_bytes: 239882851
num_examples: 7405
- name: eval
num_bytes: 1672050
num_examples: 50
download_size: 135721349
dataset_size: 241554901
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: eval
path: data/eval-*
---
|
xwjzds/hr_multiwoz_tod_sgd | ---
license: apache-2.0
---
# Dataset Card
<!-- Provide a quick summary of the dataset. -->
HR-Multiwoz is a fully-labeled dataset of 550 conversations spanning 10 HR domains for evaluating LLM agents. It is the first labeled open-sourced conversation dataset in the HR domain for NLP research.
Please refer to [HR-MultiWOZ: A Task Oriented Dialogue (TOD) Dataset for HR LLM Agent](https://arxiv.org/pdf/2402.01018.pdf) for details about the dataset construction.
### Dataset Description
- **Language(s) (NLP):** English
- **License:** MIT
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Repository:** [xwjzds/extractive_qa_question_answering_hr](https://huggingface.co/datasets/xwjzds/extractive_qa_question_answering_hr)
- **Paper:** [HR-MultiWOZ: A Task Oriented Dialogue (TOD) Dataset for HR LLM Agent](https://arxiv.org/pdf/2402.01018.pdf)
- **Leaderboard:** [github repo](https://github.com/amazon-science/hr-multiwoz-tod-llm-agent)
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
This dataset has been designed to evaluate the transfer learning ability of TOD systems and SGD algorithms.
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
This dataset is not intended for use in training.
## Dataset Structure
### Data Instances
A typical data entry in the dataset consists of the area of the conversation, turn_id, utterance, and state. Below is an example from the dataset:
```python
utterance = ["HR Assistant: No problem, tell me a bit about what kind of training you're looking for. What will you guys be learning about and do you need any special accommodations?", "Employee: We're hoping to improve our machine learning research skills. But no special accommodations are needed.", 'HR Assistant: Great, how many people will be attending this training? And approximately how long do you think it should last? ', "Employee: Three people will be participating and we'd like it to run for two days.", 'HR Assistant: Okay, thanks. Do you have a budget in mind for this? It would also help to know what your goals are for your team after the training.', 'Employee: Our budget is $5000. We want to come away with better research skills.', 'HR Assistant: No problem, when works best for your schedule? Also, do you have a preference for how the training is delivered, like online or in-person?', "Employee: We're hoping for some online seminars on June 15th, 2023.", "HR Assistant: Sounds good. One last thing - would you say your team's current skill level is beginner, intermediate or advanced? And can you provide the contact info for the person coordinating it all?", 'Employee: Their level is intermediate. The coordinator is Henry Kim and you can reach him at hkim@researchlab.com or 555-1234.']),
state = [{'training_topic': 'machine learning', 'special_requirements': 'no special accommodations are needed'}, {'number_of_participants': 'Three', 'training_duration': 'two days'}, {'budget_constraints': '$5000', 'desired_training_outcomes': 'better research skills'}, {'preferred_training_dates': 'June 15th, 2023', 'preferred_training_format': 'online seminars'}, {'current_skill_level': 'intermediate', 'contact_information': 'hkim@researchlab.com or 555-1234'}]
area = ['training_request']
```
### Data Fields
The dataset comprises the following fields (a loading sketch follows the list):
- `dialogue_id`: an int that is the id of the dialogue
- `service`: a list containing the relevant areas of the dialogue
- `turn_id`: a list of ints giving the turn ids
- `speaker`: a list of ints where 0 represents the HR Assistant and 1 represents the Employee
- `utterance`: a list of strings representing the conversation
- `state`: the state of entities and answers at each turn after the Employee speaks
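A minimal sketch, assuming the conversations can be loaded straight from this repository with the `datasets` library and that the split is named `train` (both are assumptions; adjust to the actual hosting layout):
```python
from datasets import load_dataset

# Assumed repository id and split name, for illustration only.
dialogues = load_dataset("xwjzds/hr_multiwoz_tod_sgd", split="train")

example = dialogues[0]
print("Dialogue", example["dialogue_id"], "covers:", example["service"])

# Walk the turns, mapping the speaker code to a readable role.
for turn_id, speaker, utterance in zip(
    example["turn_id"], example["speaker"], example["utterance"]
):
    role = "HR Assistant" if speaker == 0 else "Employee"
    print(f"turn {turn_id} ({role}): {utterance}")

# The dialogue state tracked after each Employee turn.
print("States:", example["state"])
```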
## Dataset Creation
Please refer to [HR-MultiWOZ: A Task Oriented Dialogue (TOD) Dataset for HR LLM Agent](https://arxiv.org/pdf/2402.01018.pdf) for details about the dataset construction.
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
Not Amazon
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
None
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
This dataset is in English and contains synthetic problems.
## Citation
If you find this work useful, you can cite the paper as follows:
```
@inproceedings{xu-etal-2024-hr,
title = "{HR}-{M}ulti{WOZ}: A Task Oriented Dialogue ({TOD}) Dataset for {HR} {LLM} Agent",
author = "Xu, Weijie and
Huang, Zicheng and
Hu, Wenxiang and
Fang, Xi and
Cherukuri, Rajesh and
Nayyar, Naumaan and
Malandri, Lorenzo and
Sengamedu, Srinivasan",
editor = "Hruschka, Estevam and
Lake, Thom and
Otani, Naoki and
Mitchell, Tom",
booktitle = "Proceedings of the First Workshop on Natural Language Processing for Human Resources (NLP4HR 2024)",
month = mar,
year = "2024",
address = "St. Julian{'}s, Malta",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2024.nlp4hr-1.5",
pages = "59--72",
abstract = "Recent advancements in Large Language Models (LLMs) have been reshaping Natural Language Processing (NLP) task in several domains. Their use in the field of Human Resources (HR) has still room for expansions and could be beneficial for several time consuming tasks. Examples such as time-off submissions, medical claims filing, and access requests are noteworthy, but they are by no means the sole instances. However the aforementioned developments must grapple with the pivotal challenge of constructing a high-quality training dataset. On one hand, most conversation datasets are solving problems for customers not employees. On the other hand, gathering conversations with HR could raise privacy concerns. To solve it, we introduce HR-Multiwoz, a fully-labeled dataset of 550 conversations spanning 10 HR domains. Our work has the following contributions:(1) It is the first labeled open-sourced conversation dataset in the HR domain for NLP research. (2) It provides a detailed recipe for the data generation procedure along with data analysis and human evaluations. The data generation pipeline is transferrable and can be easily adapted for labeled conversation data generation in other domains. (3) The proposed data-collection pipeline is mostly based on LLMs with minimal human involvement for annotation, which is time and cost-efficient.",
}
```
|
davebulaval/RISCBAC | ---
license:
- cc-by-4.0
multilinguality:
- monolingual
- aligned
task_categories:
- summarization
- question-answering
- translation
source_datasets:
- original
language:
- en
- fr
tags:
- unsupervised
pretty_name: Realistic Bilingual Synthetic Automobile Insurance Contract
size_categories:
- 10K<n<100K
dataset_info:
download_size: 376971
dataset_size: 611048
viewer: true
---
# Dataset Card for RISCBAC
RISCBAC was created using [RISC](https://github.com/GRAAL-Research/risc), an open-source Python data-generation package. RISC generates look-alike automobile insurance contracts based on the Quebec regulatory insurance form in French and English.
It contains 10,000 English and French insurance contracts generated using the same seed. Thus, contracts share the same deterministic synthetic data (RISCBAC can be used as an aligned dataset). RISC can be used to generate more data for RISCBAC.
# Data Instances
## Default (`'fr'`)
The default data instance is the French version of the dataset. It comprises 10,000 synthetic automobile insurance contracts.
## Other Option
The other data instance option is `"en"`. It comprises 10,000 synthetic automobile insurance contracts.
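A minimal loading sketch, assuming the configuration names exposed on the Hub match the `'fr'` (default) and `"en"` options described above:
```python
from datasets import load_dataset

# Config names are assumed to mirror the options above; adjust if they differ.
riscbac_fr = load_dataset("davebulaval/RISCBAC", "fr")
riscbac_en = load_dataset("davebulaval/RISCBAC", "en")

# Both languages were generated with the same seed, so contracts at the same
# index are expected to be aligned across French and English.
print(riscbac_fr)
print(riscbac_en)
```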
# Citation Information
```
@misc{beaucheminrisc,
title={{RISC: Generating Realistic Synthetic Bilingual Insurance
Contract}},
author={David Beauchemin and Richard Khoury},
year={2023},
eprint={2304.04212},
archivePrefix={arXiv}
}
```
|
stolzenp/your-first-generated-dataset | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 43559
num_examples: 100
download_size: 5356
dataset_size: 43559
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "your-first-generated-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yuyijiong/Long-instruction-en2zh | ---
license: cc-by-nc-4.0
task_categories:
- text-generation
- conversational
- summarization
language:
- zh
size_categories:
- 10K<n<100K
---
* Update (2023.10.22): A higher-quality [Chinese long-text QA dataset](https://huggingface.co/datasets/yuyijiong/Long-Instruction-Chinese) that is not Google-translated has been released; some portions still lack data and are being continuously expanded.
* Update (2023.10.18): Removed some duplicate and low-quality data. Improved the answer and instruction formats.
# Chinese Long-Text Instruction Fine-Tuning Dataset - Compilation
Since Chinese data is currently scarce, most of the data was translated from English datasets via Google Translate; the translation quality still leaves room for improvement but is serviceable for now. More data may be added in the future. \
Most of the data has been filtered to a length (character count) greater than 8,000 to meet the needs of long-text fine-tuning \
The instruction fine-tuning data has already been converted into llama's chat format: ```"<s>Human: " + question + "\n</s>" + "<s>Assistant: " + answer + "\n</s>"```
Because Chinese text is generally shorter than English, many texts become significantly shorter after being translated from English into Chinese.
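As a small illustrative sketch (the helper name and example strings below are made up), a question-answer pair can be wrapped into the llama chat format above like this:
```python
def to_llama_chat(question: str, answer: str) -> str:
    # Single-turn llama chat format used by this dataset.
    return "<s>Human: " + question + "\n</s>" + "<s>Assistant: " + answer + "\n</s>"

print(to_llama_chat("请概括这篇文章的主要内容。", "这篇文章主要讨论了长文本指令微调。"))
```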
Data composition:
## 1. LongAlpaca dataset
Data source: [Yukang/LongAlpaca-12k](https://huggingface.co/datasets/Yukang/LongAlpaca-12k) \
The original dataset has been split into five parts: book_sum, paper_qa, paper_review, paper_compare, and paper_conclusion, totaling about 9k long-text instruction fine-tuning examples \
The original dataset also contains 3k short-text instruction fine-tuning examples extracted from the alpaca dataset; this project does not include that alpaca portion.
## 2. Multi-document QA dataset
Data source: [togethercomputer/Long-Data-Collections](https://huggingface.co/datasets/togethercomputer/Long-Data-Collections) \
From more than 80,000 English multi-document QA examples, 4,000 with relatively good translation quality were selected. The instruction provides dozens or hundreds of reference documents, only one of which contains useful information; the answer must not only answer the question but also indicate the index of that reference document.
## 3. ShareGPT Chinese long-dialogue dataset
Data source: [shibing624/sharegpt_gpt4](https://huggingface.co/datasets/shibing624/sharegpt_gpt4/tree/main) \
Multi-turn dialogue data from gpt3.5 and gpt4, Chinese only, with text length greater than 4,000 |
liuyanchen1015/MULTI_VALUE_stsb_remove_det_indefinite | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 98635
num_examples: 717
- name: test
num_bytes: 78464
num_examples: 640
- name: train
num_bytes: 291054
num_examples: 2216
download_size: 271778
dataset_size: 468153
---
# Dataset Card for "MULTI_VALUE_stsb_remove_det_indefinite"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |