id (string) | lastModified (string) | tags (list) | author (string) | description (string) | citation (string) | cardData (null) | likes (int64) | downloads (int64) | card (string) |
|---|---|---|---|---|---|---|---|---|---|
talentlabs/training-data-blog-writer_v03-09-2023 | 2023-09-03T10:46:58.000Z | [
"region:us"
] | talentlabs | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: title
dtype: string
- name: article
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 48639092
num_examples: 9504
download_size: 30032406
dataset_size: 48639092
---
# Dataset Card for "training-data-blog-writer_v03-09-2023"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_RWKV__rwkv-raven-7b | 2023-09-17T20:05:38.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of RWKV/rwkv-raven-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [RWKV/rwkv-raven-7b](https://huggingface.co/RWKV/rwkv-raven-7b) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RWKV__rwkv-raven-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T20:05:27.789159](https://huggingface.co/datasets/open-llm-leaderboard/details_RWKV__rwkv-raven-7b/blob/main/results_2023-09-17T20-05-27.789159.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.10968959731543625,\n\
\ \"em_stderr\": 0.0032003177207929542,\n \"f1\": 0.15844798657718087,\n\
\ \"f1_stderr\": 0.0033868319824486323,\n \"acc\": 0.3160387943079502,\n\
\ \"acc_stderr\": 0.007545486731728054\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.10968959731543625,\n \"em_stderr\": 0.0032003177207929542,\n\
\ \"f1\": 0.15844798657718087,\n \"f1_stderr\": 0.0033868319824486323\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.003032600454890068,\n \
\ \"acc_stderr\": 0.0015145735612245395\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6290449881610103,\n \"acc_stderr\": 0.013576399902231568\n\
\ }\n}\n```"
repo_url: https://huggingface.co/RWKV/rwkv-raven-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|arc:challenge|25_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T20_05_27.789159
path:
- '**/details_harness|drop|3_2023-09-17T20-05-27.789159.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T20-05-27.789159.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T20_05_27.789159
path:
- '**/details_harness|gsm8k|5_2023-09-17T20-05-27.789159.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T20-05-27.789159.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hellaswag|10_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T10:19:01.730626.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T10:19:01.730626.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T10:19:01.730626.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T20_05_27.789159
path:
- '**/details_harness|winogrande|5_2023-09-17T20-05-27.789159.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T20-05-27.789159.parquet'
- config_name: results
data_files:
- split: 2023_09_03T10_19_01.730626
path:
- results_2023-09-03T10:19:01.730626.parquet
- split: 2023_09_17T20_05_27.789159
path:
- results_2023-09-17T20-05-27.789159.parquet
- split: latest
path:
- results_2023-09-17T20-05-27.789159.parquet
---
# Dataset Card for Evaluation run of RWKV/rwkv-raven-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/RWKV/rwkv-raven-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [RWKV/rwkv-raven-7b](https://huggingface.co/RWKV/rwkv-raven-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RWKV__rwkv-raven-7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T20:05:27.789159](https://huggingface.co/datasets/open-llm-leaderboard/details_RWKV__rwkv-raven-7b/blob/main/results_2023-09-17T20-05-27.789159.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.10968959731543625,
"em_stderr": 0.0032003177207929542,
"f1": 0.15844798657718087,
"f1_stderr": 0.0033868319824486323,
"acc": 0.3160387943079502,
"acc_stderr": 0.007545486731728054
},
"harness|drop|3": {
"em": 0.10968959731543625,
"em_stderr": 0.0032003177207929542,
"f1": 0.15844798657718087,
"f1_stderr": 0.0033868319824486323
},
"harness|gsm8k|5": {
"acc": 0.003032600454890068,
"acc_stderr": 0.0015145735612245395
},
"harness|winogrande|5": {
"acc": 0.6290449881610103,
"acc_stderr": 0.013576399902231568
}
}
```
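As a sketch of how a payload like the one above can be consumed: the task keys follow a `harness|<task>|<n_shot>` pattern, so per-task metrics can be pulled out with a small dictionary comprehension (the `results` dictionary below is a trimmed, hard-coded copy of the JSON above, not a live download):

```python
# Trimmed copy of the results payload shown above.
results = {
    "all": {"acc": 0.3160387943079502, "acc_stderr": 0.007545486731728054},
    "harness|drop|3": {"em": 0.10968959731543625, "f1": 0.15844798657718087},
    "harness|gsm8k|5": {"acc": 0.003032600454890068, "acc_stderr": 0.0015145735612245395},
    "harness|winogrande|5": {"acc": 0.6290449881610103, "acc_stderr": 0.013576399902231568},
}

# Keys look like "harness|<task>|<n_shot>"; keep only tasks that report "acc"
# (DROP reports "em"/"f1" instead, so it is filtered out here).
per_task_acc = {
    key.split("|")[1]: metrics["acc"]
    for key, metrics in results.items()
    if key.startswith("harness|") and "acc" in metrics
}
print(per_task_acc)
```

The same pattern works on the full `results` config loaded from this dataset, since the aggregated JSON files use identical key names.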
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
sekarmulyani/ulasan-ecommerce-classification | 2023-09-03T10:56:58.000Z | [
"task_categories:text-classification",
"size_categories:100K<n<1M",
"language:id",
"license:apache-2.0",
"region:us"
] | sekarmulyani | null | null | null | 0 | 0 | ---
license: apache-2.0
task_categories:
- text-classification
language:
- id
size_categories:
- 100K<n<1M
--- |
qkrwnstj/anime-captioning-dataset | 2023-09-03T10:47:55.000Z | [
"region:us"
] | qkrwnstj | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 7831425.0
num_examples: 20
download_size: 7833024
dataset_size: 7831425.0
---
# Dataset Card for "mid-journey-captioning-dataset-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_RWKV__rwkv-raven-3b | 2023-09-06T14:11:48.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of RWKV/rwkv-raven-3b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [RWKV/rwkv-raven-3b](https://huggingface.co/RWKV/rwkv-raven-3b) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RWKV__rwkv-raven-3b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-03T10:57:04.203304](https://huggingface.co/datasets/open-llm-leaderboard/details_RWKV__rwkv-raven-3b/blob/main/results_2023-09-03T10%3A57%3A04.203304.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25345090777118084,\n\
\ \"acc_stderr\": 0.03146293676978813,\n \"acc_norm\": 0.25664067497637316,\n\
\ \"acc_norm_stderr\": 0.03146743704289715,\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871122,\n \"mc2\": 0.35595005980538136,\n\
\ \"mc2_stderr\": 0.013635479123067127\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3310580204778157,\n \"acc_stderr\": 0.013752062419817829,\n\
\ \"acc_norm\": 0.36689419795221845,\n \"acc_norm_stderr\": 0.01408413311810429\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.44542919737104164,\n\
\ \"acc_stderr\": 0.004959973514772513,\n \"acc_norm\": 0.5977892850029874,\n\
\ \"acc_norm_stderr\": 0.004893418929918259\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.037498507091740206,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.037498507091740206\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19078947368421054,\n \"acc_stderr\": 0.03197565821032499,\n\
\ \"acc_norm\": 0.19078947368421054,\n \"acc_norm_stderr\": 0.03197565821032499\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2641509433962264,\n \"acc_stderr\": 0.027134291628741702,\n\
\ \"acc_norm\": 0.2641509433962264,\n \"acc_norm_stderr\": 0.027134291628741702\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n\
\ \"acc_stderr\": 0.035146974678623884,\n \"acc_norm\": 0.22916666666666666,\n\
\ \"acc_norm_stderr\": 0.035146974678623884\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n\
\ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n\
\ \"acc_stderr\": 0.03186209851641143,\n \"acc_norm\": 0.2254335260115607,\n\
\ \"acc_norm_stderr\": 0.03186209851641143\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.030683020843231008,\n\
\ \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.030683020843231008\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281334,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135303,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135303\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.022789673145776564,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.022789673145776564\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.04073524322147127,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.04073524322147127\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25483870967741934,\n\
\ \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.25483870967741934,\n\
\ \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.03161856335358611,\n\
\ \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.03161856335358611\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.029620227874790486,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.029620227874790486\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21243523316062177,\n \"acc_stderr\": 0.02951928261681724,\n\
\ \"acc_norm\": 0.21243523316062177,\n \"acc_norm_stderr\": 0.02951928261681724\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2282051282051282,\n \"acc_stderr\": 0.02127839386358628,\n \
\ \"acc_norm\": 0.2282051282051282,\n \"acc_norm_stderr\": 0.02127839386358628\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868973,\n\
\ \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868973\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2185430463576159,\n \"acc_stderr\": 0.03374235550425694,\n \"\
acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.03374235550425694\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24403669724770644,\n \"acc_stderr\": 0.018415286351416416,\n \"\
acc_norm\": 0.24403669724770644,\n \"acc_norm_stderr\": 0.018415286351416416\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.18518518518518517,\n \"acc_stderr\": 0.02649191472735516,\n \"\
acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.02649191472735516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2696078431372549,\n \"acc_stderr\": 0.031145570659486782,\n \"\
acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.031145570659486782\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.25738396624472576,\n \"acc_stderr\": 0.028458820991460302,\n \
\ \"acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.028458820991460302\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.35874439461883406,\n\
\ \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.35874439461883406,\n\
\ \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2231404958677686,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.2231404958677686,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252628,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.0449394906861354,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.0449394906861354\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2606837606837607,\n\
\ \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.2606837606837607,\n\
\ \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26947637292464877,\n\
\ \"acc_stderr\": 0.015866243073215037,\n \"acc_norm\": 0.26947637292464877,\n\
\ \"acc_norm_stderr\": 0.015866243073215037\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.023083658586984204,\n\
\ \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.023083658586984204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n\
\ \"acc_stderr\": 0.01428834380392531,\n \"acc_norm\": 0.24022346368715083,\n\
\ \"acc_norm_stderr\": 0.01428834380392531\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24183006535947713,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26688102893890675,\n\
\ \"acc_stderr\": 0.02512263760881665,\n \"acc_norm\": 0.26688102893890675,\n\
\ \"acc_norm_stderr\": 0.02512263760881665\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.023788583551658533,\n\
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.023788583551658533\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590627,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590627\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.258148631029987,\n\
\ \"acc_stderr\": 0.011176923719313408,\n \"acc_norm\": 0.258148631029987,\n\
\ \"acc_norm_stderr\": 0.011176923719313408\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.024562204314142317,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.024562204314142317\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25980392156862747,\n \"acc_stderr\": 0.01774089950917779,\n \
\ \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.01774089950917779\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.17959183673469387,\n \"acc_stderr\": 0.024573293589585637,\n\
\ \"acc_norm\": 0.17959183673469387,\n \"acc_norm_stderr\": 0.024573293589585637\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n\
\ \"acc_stderr\": 0.0362933532994786,\n \"acc_norm\": 0.3192771084337349,\n\
\ \"acc_norm_stderr\": 0.0362933532994786\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871122,\n \"mc2\": 0.35595005980538136,\n\
\ \"mc2_stderr\": 0.013635479123067127\n }\n}\n```"
repo_url: https://huggingface.co/RWKV/rwkv-raven-3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|arc:challenge|25_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hellaswag|10_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T10:57:04.203304.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T10:57:04.203304.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T10:57:04.203304.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T10:57:04.203304.parquet'
- config_name: results
data_files:
- split: 2023_09_03T10_57_04.203304
path:
- results_2023-09-03T10:57:04.203304.parquet
- split: latest
path:
- results_2023-09-03T10:57:04.203304.parquet
---
# Dataset Card for Evaluation run of RWKV/rwkv-raven-3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/RWKV/rwkv-raven-3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [RWKV/rwkv-raven-3b](https://huggingface.co/RWKV/rwkv-raven-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RWKV__rwkv-raven-3b",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-03T10:57:04.203304](https://huggingface.co/datasets/open-llm-leaderboard/details_RWKV__rwkv-raven-3b/blob/main/results_2023-09-03T10%3A57%3A04.203304.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25345090777118084,
"acc_stderr": 0.03146293676978813,
"acc_norm": 0.25664067497637316,
"acc_norm_stderr": 0.03146743704289715,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871122,
"mc2": 0.35595005980538136,
"mc2_stderr": 0.013635479123067127
},
"harness|arc:challenge|25": {
"acc": 0.3310580204778157,
"acc_stderr": 0.013752062419817829,
"acc_norm": 0.36689419795221845,
"acc_norm_stderr": 0.01408413311810429
},
"harness|hellaswag|10": {
"acc": 0.44542919737104164,
"acc_stderr": 0.004959973514772513,
"acc_norm": 0.5977892850029874,
"acc_norm_stderr": 0.004893418929918259
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.037498507091740206,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.037498507091740206
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19078947368421054,
"acc_stderr": 0.03197565821032499,
"acc_norm": 0.19078947368421054,
"acc_norm_stderr": 0.03197565821032499
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2641509433962264,
"acc_stderr": 0.027134291628741702,
"acc_norm": 0.2641509433962264,
"acc_norm_stderr": 0.027134291628741702
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.035146974678623884,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.035146974678623884
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641143,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641143
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.030683020843231008,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.030683020843231008
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281334,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135303,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135303
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.022789673145776564,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.022789673145776564
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147127,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147127
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25483870967741934,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.25483870967741934,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.03161856335358611,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.03161856335358611
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21243523316062177,
"acc_stderr": 0.02951928261681724,
"acc_norm": 0.21243523316062177,
"acc_norm_stderr": 0.02951928261681724
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2282051282051282,
"acc_stderr": 0.02127839386358628,
"acc_norm": 0.2282051282051282,
"acc_norm_stderr": 0.02127839386358628
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868973,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868973
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.03374235550425694,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.03374235550425694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24403669724770644,
"acc_stderr": 0.018415286351416416,
"acc_norm": 0.24403669724770644,
"acc_norm_stderr": 0.018415286351416416
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.02649191472735516,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.02649191472735516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.031145570659486782,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.031145570659486782
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25738396624472576,
"acc_stderr": 0.028458820991460302,
"acc_norm": 0.25738396624472576,
"acc_norm_stderr": 0.028458820991460302
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.35874439461883406,
"acc_stderr": 0.032190792004199956,
"acc_norm": 0.35874439461883406,
"acc_norm_stderr": 0.032190792004199956
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2231404958677686,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.2231404958677686,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252628,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.0449394906861354,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.0449394906861354
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2606837606837607,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.2606837606837607,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26947637292464877,
"acc_stderr": 0.015866243073215037,
"acc_norm": 0.26947637292464877,
"acc_norm_stderr": 0.015866243073215037
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.01428834380392531,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.01428834380392531
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26688102893890675,
"acc_stderr": 0.02512263760881665,
"acc_norm": 0.26688102893890675,
"acc_norm_stderr": 0.02512263760881665
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.023788583551658533,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.023788583551658533
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590627,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590627
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.258148631029987,
"acc_stderr": 0.011176923719313408,
"acc_norm": 0.258148631029987,
"acc_norm_stderr": 0.011176923719313408
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.024562204314142317,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.024562204314142317
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.01774089950917779,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.01774089950917779
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17959183673469387,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.17959183673469387,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.0362933532994786,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.0362933532994786
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871122,
"mc2": 0.35595005980538136,
"mc2_stderr": 0.013635479123067127
}
}
```
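The per-task entries above can be post-processed directly once the JSON is loaded. As a quick illustration (a sketch using a truncated stand-in for the full dict above, not official leaderboard code), the MMLU (`hendrycksTest`) sub-task accuracies can be averaged while skipping tasks such as TruthfulQA that report different metrics:

```python
from statistics import mean

# Stand-in for the full "latest results" dict shown above (truncated; the
# real dict has one entry per evaluated task -- these values are copied from it).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.19, "acc_norm": 0.19},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.2518518518518518, "acc_norm": 0.2518518518518518},
    "harness|truthfulqa:mc|0": {"mc1": 0.23623011015911874, "mc2": 0.35595005980538136},
}

# Average accuracy over the MMLU (hendrycksTest) sub-tasks only; other
# harness tasks use metric keys like "mc1"/"mc2" and are filtered out.
mmlu_acc = mean(
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")
)
print(round(mmlu_acc, 4))  # prints 0.2209
```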
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
luisHuang/test_MakeDataset1 | 2023-09-03T11:21:02.000Z | [
"license:openrail",
"region:us"
] | luisHuang | null | null | null | 0 | 0 | ---
license: openrail
---
|
AmelieSchreiber/human_proteins_binding_sites | 2023-09-03T11:26:47.000Z | [
"license:mit",
"region:us"
] | AmelieSchreiber | null | null | null | 0 | 0 | ---
license: mit
---
|
dread1900/AnomalyV1 | 2023-09-03T11:29:46.000Z | [
"region:us"
] | dread1900 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_KnutJaegersberg__black_goo_recipe_d | 2023-09-03T11:43:51.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of KnutJaegersberg/black_goo_recipe_d
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KnutJaegersberg/black_goo_recipe_d](https://huggingface.co/KnutJaegersberg/black_goo_recipe_d)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__black_goo_recipe_d\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-03T11:42:32.382522](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__black_goo_recipe_d/blob/main/results_2023-09-03T11%3A42%3A32.382522.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2716316460871278,\n\
\ \"acc_stderr\": 0.03203316106802326,\n \"acc_norm\": 0.275066453424396,\n\
\ \"acc_norm_stderr\": 0.03203199875985562,\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.01489627744104183,\n \"mc2\": 0.3645987426187387,\n\
\ \"mc2_stderr\": 0.013437572221248745\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.35238907849829354,\n \"acc_stderr\": 0.01396014260059869,\n\
\ \"acc_norm\": 0.3779863481228669,\n \"acc_norm_stderr\": 0.014169664520303101\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4879506074487154,\n\
\ \"acc_stderr\": 0.004988332289642083,\n \"acc_norm\": 0.6650069707229636,\n\
\ \"acc_norm_stderr\": 0.004710234188047348\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2,\n \
\ \"acc_stderr\": 0.03455473702325436,\n \"acc_norm\": 0.2,\n \"\
acc_norm_stderr\": 0.03455473702325436\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.035834961763610625,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.035834961763610625\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n\
\ \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.28679245283018867,\n \"acc_stderr\": 0.02783491252754407,\n\
\ \"acc_norm\": 0.28679245283018867,\n \"acc_norm_stderr\": 0.02783491252754407\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.037455547914624576,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.037455547914624576\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.28901734104046245,\n\
\ \"acc_stderr\": 0.03456425745087,\n \"acc_norm\": 0.28901734104046245,\n\
\ \"acc_norm_stderr\": 0.03456425745087\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.030683020843231008,\n\
\ \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.030683020843231008\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.040493392977481425,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.040493392977481425\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.03695183311650232,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.03695183311650232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.023068188848261135,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.023068188848261135\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.0361960452412425,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.0361960452412425\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n\
\ \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n\
\ \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694436,\n\
\ \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694436\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603489,\n\
\ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603489\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.30808080808080807,\n \"acc_stderr\": 0.032894773300986155,\n \"\
acc_norm\": 0.30808080808080807,\n \"acc_norm_stderr\": 0.032894773300986155\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3005181347150259,\n \"acc_stderr\": 0.03308818594415753,\n\
\ \"acc_norm\": 0.3005181347150259,\n \"acc_norm_stderr\": 0.03308818594415753\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.35128205128205126,\n \"acc_stderr\": 0.024203665177902803,\n\
\ \"acc_norm\": 0.35128205128205126,\n \"acc_norm_stderr\": 0.024203665177902803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.02606715922227579,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.02606715922227579\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.226890756302521,\n \"acc_stderr\": 0.027205371538279483,\n \
\ \"acc_norm\": 0.226890756302521,\n \"acc_norm_stderr\": 0.027205371538279483\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.036848815213890225,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.036848815213890225\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.29908256880733947,\n \"acc_stderr\": 0.019630417285415182,\n \"\
acc_norm\": 0.29908256880733947,\n \"acc_norm_stderr\": 0.019630417285415182\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.21568627450980393,\n \"acc_stderr\": 0.028867431449849316,\n \"\
acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.028867431449849316\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.24472573839662448,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.24472573839662448,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.18834080717488788,\n\
\ \"acc_stderr\": 0.02624113299640726,\n \"acc_norm\": 0.18834080717488788,\n\
\ \"acc_norm_stderr\": 0.02624113299640726\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.20610687022900764,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.20610687022900764,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.33884297520661155,\n \"acc_stderr\": 0.0432076780753667,\n \"\
acc_norm\": 0.33884297520661155,\n \"acc_norm_stderr\": 0.0432076780753667\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16964285714285715,\n\
\ \"acc_stderr\": 0.0356236785009539,\n \"acc_norm\": 0.16964285714285715,\n\
\ \"acc_norm_stderr\": 0.0356236785009539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.04453254836326469,\n\
\ \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.04453254836326469\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.18803418803418803,\n\
\ \"acc_stderr\": 0.02559819368665226,\n \"acc_norm\": 0.18803418803418803,\n\
\ \"acc_norm_stderr\": 0.02559819368665226\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.27330779054916987,\n\
\ \"acc_stderr\": 0.015936681062628563,\n \"acc_norm\": 0.27330779054916987,\n\
\ \"acc_norm_stderr\": 0.015936681062628563\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2398843930635838,\n \"acc_stderr\": 0.022989592543123574,\n\
\ \"acc_norm\": 0.2398843930635838,\n \"acc_norm_stderr\": 0.022989592543123574\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2679738562091503,\n \"acc_stderr\": 0.025360603796242553,\n\
\ \"acc_norm\": 0.2679738562091503,\n \"acc_norm_stderr\": 0.025360603796242553\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.29260450160771706,\n\
\ \"acc_stderr\": 0.025839898334877976,\n \"acc_norm\": 0.29260450160771706,\n\
\ \"acc_norm_stderr\": 0.025839898334877976\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25308641975308643,\n \"acc_stderr\": 0.024191808600713,\n\
\ \"acc_norm\": 0.25308641975308643,\n \"acc_norm_stderr\": 0.024191808600713\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25177304964539005,\n \"acc_stderr\": 0.0258921511567094,\n \
\ \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.0258921511567094\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2516297262059974,\n\
\ \"acc_stderr\": 0.011083276280441907,\n \"acc_norm\": 0.2516297262059974,\n\
\ \"acc_norm_stderr\": 0.011083276280441907\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4264705882352941,\n \"acc_stderr\": 0.030042615832714854,\n\
\ \"acc_norm\": 0.4264705882352941,\n \"acc_norm_stderr\": 0.030042615832714854\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.23202614379084968,\n \"acc_stderr\": 0.017077373377857013,\n \
\ \"acc_norm\": 0.23202614379084968,\n \"acc_norm_stderr\": 0.017077373377857013\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.1890547263681592,\n\
\ \"acc_stderr\": 0.027686913588013003,\n \"acc_norm\": 0.1890547263681592,\n\
\ \"acc_norm_stderr\": 0.027686913588013003\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.22289156626506024,\n\
\ \"acc_stderr\": 0.03240004825594687,\n \"acc_norm\": 0.22289156626506024,\n\
\ \"acc_norm_stderr\": 0.03240004825594687\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.23976608187134502,\n \"acc_stderr\": 0.032744852119469564,\n\
\ \"acc_norm\": 0.23976608187134502,\n \"acc_norm_stderr\": 0.032744852119469564\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.01489627744104183,\n \"mc2\": 0.3645987426187387,\n\
\ \"mc2_stderr\": 0.013437572221248745\n }\n}\n```"
repo_url: https://huggingface.co/KnutJaegersberg/black_goo_recipe_d
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|arc:challenge|25_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hellaswag|10_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T11:42:32.382522.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T11:42:32.382522.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T11:42:32.382522.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T11:42:32.382522.parquet'
- config_name: results
data_files:
- split: 2023_09_03T11_42_32.382522
path:
- results_2023-09-03T11:42:32.382522.parquet
- split: latest
path:
- results_2023-09-03T11:42:32.382522.parquet
---
# Dataset Card for Evaluation run of KnutJaegersberg/black_goo_recipe_d
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/black_goo_recipe_d
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KnutJaegersberg/black_goo_recipe_d](https://huggingface.co/KnutJaegersberg/black_goo_recipe_d) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__black_goo_recipe_d",
"harness_truthfulqa_mc_0",
split="latest")
```
## Latest results
These are the [latest results from run 2023-09-03T11:42:32.382522](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__black_goo_recipe_d/blob/main/results_2023-09-03T11%3A42%3A32.382522.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2716316460871278,
"acc_stderr": 0.03203316106802326,
"acc_norm": 0.275066453424396,
"acc_norm_stderr": 0.03203199875985562,
"mc1": 0.23745410036719705,
"mc1_stderr": 0.01489627744104183,
"mc2": 0.3645987426187387,
"mc2_stderr": 0.013437572221248745
},
"harness|arc:challenge|25": {
"acc": 0.35238907849829354,
"acc_stderr": 0.01396014260059869,
"acc_norm": 0.3779863481228669,
"acc_norm_stderr": 0.014169664520303101
},
"harness|hellaswag|10": {
"acc": 0.4879506074487154,
"acc_stderr": 0.004988332289642083,
"acc_norm": 0.6650069707229636,
"acc_norm_stderr": 0.004710234188047348
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2,
"acc_stderr": 0.03455473702325436,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03455473702325436
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.035834961763610625,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.035834961763610625
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.28679245283018867,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.28679245283018867,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.037455547914624576,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.037455547914624576
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.28901734104046245,
"acc_stderr": 0.03456425745087,
"acc_norm": 0.28901734104046245,
"acc_norm_stderr": 0.03456425745087
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929776,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929776
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.030683020843231008,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.030683020843231008
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481425,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481425
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.03695183311650232,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.03695183311650232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.023068188848261135,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.023068188848261135
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.0361960452412425,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.0361960452412425
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694436,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694436
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.03287666758603489,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.03287666758603489
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.30808080808080807,
"acc_stderr": 0.032894773300986155,
"acc_norm": 0.30808080808080807,
"acc_norm_stderr": 0.032894773300986155
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3005181347150259,
"acc_stderr": 0.03308818594415753,
"acc_norm": 0.3005181347150259,
"acc_norm_stderr": 0.03308818594415753
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.35128205128205126,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.35128205128205126,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.02606715922227579,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.02606715922227579
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.226890756302521,
"acc_stderr": 0.027205371538279483,
"acc_norm": 0.226890756302521,
"acc_norm_stderr": 0.027205371538279483
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.036848815213890225,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.036848815213890225
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.29908256880733947,
"acc_stderr": 0.019630417285415182,
"acc_norm": 0.29908256880733947,
"acc_norm_stderr": 0.019630417285415182
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39351851851851855,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.39351851851851855,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.028867431449849316,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.028867431449849316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.24472573839662448,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.24472573839662448,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.18834080717488788,
"acc_stderr": 0.02624113299640726,
"acc_norm": 0.18834080717488788,
"acc_norm_stderr": 0.02624113299640726
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.20610687022900764,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.20610687022900764,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.33884297520661155,
"acc_stderr": 0.0432076780753667,
"acc_norm": 0.33884297520661155,
"acc_norm_stderr": 0.0432076780753667
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.16964285714285715,
"acc_stderr": 0.0356236785009539,
"acc_norm": 0.16964285714285715,
"acc_norm_stderr": 0.0356236785009539
},
"harness|hendrycksTest-management|5": {
"acc": 0.2815533980582524,
"acc_stderr": 0.04453254836326469,
"acc_norm": 0.2815533980582524,
"acc_norm_stderr": 0.04453254836326469
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.18803418803418803,
"acc_stderr": 0.02559819368665226,
"acc_norm": 0.18803418803418803,
"acc_norm_stderr": 0.02559819368665226
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.27330779054916987,
"acc_stderr": 0.015936681062628563,
"acc_norm": 0.27330779054916987,
"acc_norm_stderr": 0.015936681062628563
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2398843930635838,
"acc_stderr": 0.022989592543123574,
"acc_norm": 0.2398843930635838,
"acc_norm_stderr": 0.022989592543123574
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2679738562091503,
"acc_stderr": 0.025360603796242553,
"acc_norm": 0.2679738562091503,
"acc_norm_stderr": 0.025360603796242553
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.29260450160771706,
"acc_stderr": 0.025839898334877976,
"acc_norm": 0.29260450160771706,
"acc_norm_stderr": 0.025839898334877976
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25308641975308643,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.25308641975308643,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.0258921511567094,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.0258921511567094
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2516297262059974,
"acc_stderr": 0.011083276280441907,
"acc_norm": 0.2516297262059974,
"acc_norm_stderr": 0.011083276280441907
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4264705882352941,
"acc_stderr": 0.030042615832714854,
"acc_norm": 0.4264705882352941,
"acc_norm_stderr": 0.030042615832714854
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.23202614379084968,
"acc_stderr": 0.017077373377857013,
"acc_norm": 0.23202614379084968,
"acc_norm_stderr": 0.017077373377857013
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.4,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.1890547263681592,
"acc_stderr": 0.027686913588013003,
"acc_norm": 0.1890547263681592,
"acc_norm_stderr": 0.027686913588013003
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.22289156626506024,
"acc_stderr": 0.03240004825594687,
"acc_norm": 0.22289156626506024,
"acc_norm_stderr": 0.03240004825594687
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23976608187134502,
"acc_stderr": 0.032744852119469564,
"acc_norm": 0.23976608187134502,
"acc_norm_stderr": 0.032744852119469564
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23745410036719705,
"mc1_stderr": 0.01489627744104183,
"mc2": 0.3645987426187387,
"mc2_stderr": 0.013437572221248745
}
}
```
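The "all" block at the top of these results aggregates the per-task entries. As a rough illustration only (not the leaderboard's exact aggregation code), the sketch below recomputes a mean accuracy over a small hand-copied subset of the `hendrycksTest` scores shown above; the `subset` dict is an excerpt for demonstration, not the full result set.

```python
# Excerpt of the per-task "acc" scores from the results JSON above.
# Only four tasks are included here, purely to illustrate the aggregation.
subset = {
    "harness|hendrycksTest-abstract_algebra|5": 0.21,
    "harness|hendrycksTest-anatomy|5": 0.2,
    "harness|hendrycksTest-astronomy|5": 0.2631578947368421,
    "harness|hendrycksTest-business_ethics|5": 0.24,
}

# Select the MMLU-style tasks by their "harness|hendrycksTest-" prefix
# and average their accuracies, as an aggregate metric would.
mmlu_accs = [acc for task, acc in subset.items()
             if task.startswith("harness|hendrycksTest-")]
mean_acc = sum(mmlu_accs) / len(mmlu_accs)
print(f"mean acc over {len(mmlu_accs)} tasks: {mean_acc:.4f}")
```

The same pattern extends to all 57 `hendrycksTest` entries in the full JSON when computing an MMLU average.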
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Henil1/Doctor | 2023-09-03T16:37:28.000Z | [
"region:us"
] | Henil1 | null | null | null | 0 | 0 | Entry not found |
RPBJO/comp | 2023-09-03T11:47:33.000Z | [
"region:us"
] | RPBJO | null | null | null | 0 | 0 | Entry not found |
vhtran/de-en | 2023-09-03T12:00:19.000Z | [
"license:cc-by-4.0",
"region:us"
] | vhtran | null | null | null | 0 | 0 | ---
license: cc-by-4.0
---
|
Asad182/fine-tuning-urdu | 2023-09-03T12:10:58.000Z | [
"region:us"
] | Asad182 | null | null | null | 0 | 0 | Entry not found |
Gummybear05/Y_speed | 2023-09-04T11:39:12.000Z | [
"region:us"
] | Gummybear05 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float32
- name: path
dtype: string
- name: sample_rate
dtype: int64
- name: text
dtype: string
- name: scriptId
dtype: int64
- name: fileNm
dtype: string
- name: recrdTime
dtype: float64
- name: recrdQuality
dtype: int64
- name: recrdDt
dtype: string
- name: scriptSetNo
dtype: string
- name: recrdEnvrn
dtype: string
- name: colctUnitCode
dtype: string
- name: cityCode
dtype: string
- name: recrdUnit
dtype: string
- name: convrsThema
dtype: string
- name: gender
dtype: string
- name: recorderId
dtype: string
- name: age
dtype: int64
splits:
- name: train
num_bytes: 2322247497
num_examples: 5400
download_size: 2348923012
dataset_size: 2322247497
---
# Dataset Card for "Y_speed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GFA-D2/pilot_flags | 2023-10-10T07:40:27.000Z | [
"region:us"
] | GFA-D2 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yzhuang/autotree_automl_MiniBooNE_sgosdt_l256_d3_sd0 | 2023-09-04T11:50:22.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 615280000
num_examples: 10000
- name: validation
num_bytes: 615280000
num_examples: 10000
download_size: 1111807343
dataset_size: 1230560000
---
# Dataset Card for "autotree_automl_MiniBooNE_sgosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LauraExp/LILTS | 2023-09-03T12:50:21.000Z | [
"region:us"
] | LauraExp | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 3396339.0
num_examples: 2
- name: test
num_bytes: 3396339.0
num_examples: 2
download_size: 0
dataset_size: 6792678.0
---
# Dataset Card for "LILTS"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
champkris/cheitraining | 2023-09-03T12:34:06.000Z | [
"region:us"
] | champkris | null | null | null | 0 | 0 | Entry not found |
Jana1994/sv_corpora_parliament_processed | 2023-09-03T12:39:11.000Z | [
"region:us"
] | Jana1994 | null | null | null | 0 | 0 | Entry not found |
LauraExp/LILT2 | 2023-09-03T12:55:30.000Z | [
"region:us"
] | LauraExp | null | null | null | 0 | 0 | Entry not found |
eyalshub/gin | 2023-09-03T13:06:29.000Z | [
"region:us"
] | eyalshub | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank14_v3 | 2023-09-03T13:18:25.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of xxyyy123/10k_v1_lora_qkvo_rank14_v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xxyyy123/10k_v1_lora_qkvo_rank14_v3](https://huggingface.co/xxyyy123/10k_v1_lora_qkvo_rank14_v3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank14_v3\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-03T13:17:02.987872](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank14_v3/blob/main/results_2023-09-03T13%3A17%3A02.987872.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5091352266849982,\n\
\ \"acc_stderr\": 0.03495474191892426,\n \"acc_norm\": 0.5128128131582483,\n\
\ \"acc_norm_stderr\": 0.03493935725866389,\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5344202990692574,\n\
\ \"mc2_stderr\": 0.015729161957393895\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5298634812286689,\n \"acc_stderr\": 0.014585305840007105,\n\
\ \"acc_norm\": 0.5597269624573379,\n \"acc_norm_stderr\": 0.01450676952480424\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6050587532364071,\n\
\ \"acc_stderr\": 0.004878390226591715,\n \"acc_norm\": 0.7921728739294961,\n\
\ \"acc_norm_stderr\": 0.00404923158643323\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04068942293855797,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04068942293855797\n },\n\
\ \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n \
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5584905660377358,\n \"acc_stderr\": 0.030561590426731833,\n\
\ \"acc_norm\": 0.5584905660377358,\n \"acc_norm_stderr\": 0.030561590426731833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n\
\ \"acc_stderr\": 0.041614023984032786,\n \"acc_norm\": 0.5486111111111112,\n\
\ \"acc_norm_stderr\": 0.041614023984032786\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.45664739884393063,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.45664739884393063,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29365079365079366,\n \"acc_stderr\": 0.023456037383982022,\n \"\
acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.023456037383982022\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235173,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5451612903225806,\n\
\ \"acc_stderr\": 0.028327743091561077,\n \"acc_norm\": 0.5451612903225806,\n\
\ \"acc_norm_stderr\": 0.028327743091561077\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.034139638059062345,\n\
\ \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.034139638059062345\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6363636363636364,\n \"acc_stderr\": 0.03427308652999934,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03427308652999934\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7253886010362695,\n \"acc_stderr\": 0.03221024508041153,\n\
\ \"acc_norm\": 0.7253886010362695,\n \"acc_norm_stderr\": 0.03221024508041153\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4948717948717949,\n \"acc_stderr\": 0.02534967290683866,\n \
\ \"acc_norm\": 0.4948717948717949,\n \"acc_norm_stderr\": 0.02534967290683866\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275805,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275805\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5126050420168067,\n \"acc_stderr\": 0.03246816765752174,\n \
\ \"acc_norm\": 0.5126050420168067,\n \"acc_norm_stderr\": 0.03246816765752174\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7211009174311926,\n \"acc_stderr\": 0.0192274688764635,\n \"acc_norm\"\
: 0.7211009174311926,\n \"acc_norm_stderr\": 0.0192274688764635\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4166666666666667,\n\
\ \"acc_stderr\": 0.03362277436608043,\n \"acc_norm\": 0.4166666666666667,\n\
\ \"acc_norm_stderr\": 0.03362277436608043\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.03228210387037892,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.03228210387037892\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5515695067264574,\n\
\ \"acc_stderr\": 0.03337883736255098,\n \"acc_norm\": 0.5515695067264574,\n\
\ \"acc_norm_stderr\": 0.03337883736255098\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009224,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009224\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"\
acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5398773006134969,\n \"acc_stderr\": 0.03915857291436971,\n\
\ \"acc_norm\": 0.5398773006134969,\n \"acc_norm_stderr\": 0.03915857291436971\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7649572649572649,\n\
\ \"acc_stderr\": 0.027778835904935434,\n \"acc_norm\": 0.7649572649572649,\n\
\ \"acc_norm_stderr\": 0.027778835904935434\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.016328814422102052,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.016328814422102052\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.026680134761679214,\n\
\ \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.026680134761679214\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2547486033519553,\n\
\ \"acc_stderr\": 0.014572650383409155,\n \"acc_norm\": 0.2547486033519553,\n\
\ \"acc_norm_stderr\": 0.014572650383409155\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5359477124183006,\n \"acc_stderr\": 0.02855582751652878,\n\
\ \"acc_norm\": 0.5359477124183006,\n \"acc_norm_stderr\": 0.02855582751652878\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5819935691318328,\n\
\ \"acc_stderr\": 0.028013651891995072,\n \"acc_norm\": 0.5819935691318328,\n\
\ \"acc_norm_stderr\": 0.028013651891995072\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5524691358024691,\n \"acc_stderr\": 0.027667138569422704,\n\
\ \"acc_norm\": 0.5524691358024691,\n \"acc_norm_stderr\": 0.027667138569422704\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36879432624113473,\n \"acc_stderr\": 0.028782227561347243,\n \
\ \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.028782227561347243\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3820078226857888,\n\
\ \"acc_stderr\": 0.012409564470235567,\n \"acc_norm\": 0.3820078226857888,\n\
\ \"acc_norm_stderr\": 0.012409564470235567\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.48161764705882354,\n \"acc_stderr\": 0.030352303395351964,\n\
\ \"acc_norm\": 0.48161764705882354,\n \"acc_norm_stderr\": 0.030352303395351964\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4820261437908497,\n \"acc_stderr\": 0.020214761037872404,\n \
\ \"acc_norm\": 0.4820261437908497,\n \"acc_norm_stderr\": 0.020214761037872404\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6040816326530613,\n \"acc_stderr\": 0.03130802899065686,\n\
\ \"acc_norm\": 0.6040816326530613,\n \"acc_norm_stderr\": 0.03130802899065686\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5373134328358209,\n\
\ \"acc_stderr\": 0.035256751674679745,\n \"acc_norm\": 0.5373134328358209,\n\
\ \"acc_norm_stderr\": 0.035256751674679745\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5344202990692574,\n\
\ \"mc2_stderr\": 0.015729161957393895\n }\n}\n```"
repo_url: https://huggingface.co/xxyyy123/10k_v1_lora_qkvo_rank14_v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|arc:challenge|25_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hellaswag|10_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T13:17:02.987872.parquet'
- config_name: results
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- results_2023-09-03T13:17:02.987872.parquet
- split: latest
path:
- results_2023-09-03T13:17:02.987872.parquet
---
# Dataset Card for Evaluation run of xxyyy123/10k_v1_lora_qkvo_rank14_v3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/xxyyy123/10k_v1_lora_qkvo_rank14_v3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [xxyyy123/10k_v1_lora_qkvo_rank14_v3](https://huggingface.co/xxyyy123/10k_v1_lora_qkvo_rank14_v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank14_v3",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-03T13:17:02.987872](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank14_v3/blob/main/results_2023-09-03T13%3A17%3A02.987872.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5091352266849982,
"acc_stderr": 0.03495474191892426,
"acc_norm": 0.5128128131582483,
"acc_norm_stderr": 0.03493935725866389,
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5344202990692574,
"mc2_stderr": 0.015729161957393895
},
"harness|arc:challenge|25": {
"acc": 0.5298634812286689,
"acc_stderr": 0.014585305840007105,
"acc_norm": 0.5597269624573379,
"acc_norm_stderr": 0.01450676952480424
},
"harness|hellaswag|10": {
"acc": 0.6050587532364071,
"acc_stderr": 0.004878390226591715,
"acc_norm": 0.7921728739294961,
"acc_norm_stderr": 0.00404923158643323
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5,
"acc_stderr": 0.04068942293855797,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04068942293855797
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5584905660377358,
"acc_stderr": 0.030561590426731833,
"acc_norm": 0.5584905660377358,
"acc_norm_stderr": 0.030561590426731833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5486111111111112,
"acc_stderr": 0.041614023984032786,
"acc_norm": 0.5486111111111112,
"acc_norm_stderr": 0.041614023984032786
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.45664739884393063,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.45664739884393063,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.023456037383982022,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.023456037383982022
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235173,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5451612903225806,
"acc_stderr": 0.028327743091561077,
"acc_norm": 0.5451612903225806,
"acc_norm_stderr": 0.028327743091561077
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3793103448275862,
"acc_stderr": 0.034139638059062345,
"acc_norm": 0.3793103448275862,
"acc_norm_stderr": 0.034139638059062345
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03427308652999934,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03427308652999934
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7253886010362695,
"acc_stderr": 0.03221024508041153,
"acc_norm": 0.7253886010362695,
"acc_norm_stderr": 0.03221024508041153
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4948717948717949,
"acc_stderr": 0.02534967290683866,
"acc_norm": 0.4948717948717949,
"acc_norm_stderr": 0.02534967290683866
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275805,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275805
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5126050420168067,
"acc_stderr": 0.03246816765752174,
"acc_norm": 0.5126050420168067,
"acc_norm_stderr": 0.03246816765752174
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7211009174311926,
"acc_stderr": 0.0192274688764635,
"acc_norm": 0.7211009174311926,
"acc_norm_stderr": 0.0192274688764635
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608043,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.03228210387037892,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.03228210387037892
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5515695067264574,
"acc_stderr": 0.03337883736255098,
"acc_norm": 0.5515695067264574,
"acc_norm_stderr": 0.03337883736255098
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009224,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009224
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5398773006134969,
"acc_stderr": 0.03915857291436971,
"acc_norm": 0.5398773006134969,
"acc_norm_stderr": 0.03915857291436971
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7649572649572649,
"acc_stderr": 0.027778835904935434,
"acc_norm": 0.7649572649572649,
"acc_norm_stderr": 0.027778835904935434
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.016328814422102052,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.016328814422102052
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.026680134761679214,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.026680134761679214
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2547486033519553,
"acc_stderr": 0.014572650383409155,
"acc_norm": 0.2547486033519553,
"acc_norm_stderr": 0.014572650383409155
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5359477124183006,
"acc_stderr": 0.02855582751652878,
"acc_norm": 0.5359477124183006,
"acc_norm_stderr": 0.02855582751652878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5819935691318328,
"acc_stderr": 0.028013651891995072,
"acc_norm": 0.5819935691318328,
"acc_norm_stderr": 0.028013651891995072
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5524691358024691,
"acc_stderr": 0.027667138569422704,
"acc_norm": 0.5524691358024691,
"acc_norm_stderr": 0.027667138569422704
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36879432624113473,
"acc_stderr": 0.028782227561347243,
"acc_norm": 0.36879432624113473,
"acc_norm_stderr": 0.028782227561347243
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3820078226857888,
"acc_stderr": 0.012409564470235567,
"acc_norm": 0.3820078226857888,
"acc_norm_stderr": 0.012409564470235567
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.48161764705882354,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.48161764705882354,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4820261437908497,
"acc_stderr": 0.020214761037872404,
"acc_norm": 0.4820261437908497,
"acc_norm_stderr": 0.020214761037872404
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6040816326530613,
"acc_stderr": 0.03130802899065686,
"acc_norm": 0.6040816326530613,
"acc_norm_stderr": 0.03130802899065686
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5373134328358209,
"acc_stderr": 0.035256751674679745,
"acc_norm": 0.5373134328358209,
"acc_norm_stderr": 0.035256751674679745
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6783625730994152,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.6783625730994152,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5344202990692574,
"mc2_stderr": 0.015729161957393895
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
eyalshub/un-gin | 2023-09-06T10:27:19.000Z | [
"region:us"
] | eyalshub | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_xxyyy123__20k_v1_lora_qkvo_rank14_v2 | 2023-09-03T13:21:25.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of xxyyy123/20k_v1_lora_qkvo_rank14_v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xxyyy123/20k_v1_lora_qkvo_rank14_v2](https://huggingface.co/xxyyy123/20k_v1_lora_qkvo_rank14_v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xxyyy123__20k_v1_lora_qkvo_rank14_v2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-03T13:20:05.284068](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__20k_v1_lora_qkvo_rank14_v2/blob/main/results_2023-09-03T13%3A20%3A05.284068.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5085078094647929,\n\
\ \"acc_stderr\": 0.03515476481930117,\n \"acc_norm\": 0.5121639524782741,\n\
\ \"acc_norm_stderr\": 0.035139678425659286,\n \"mc1\": 0.3574051407588739,\n\
\ \"mc1_stderr\": 0.0167765996767294,\n \"mc2\": 0.5157743333677478,\n\
\ \"mc2_stderr\": 0.01586124547215222\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5264505119453925,\n \"acc_stderr\": 0.01459093135812017,\n\
\ \"acc_norm\": 0.5537542662116041,\n \"acc_norm_stderr\": 0.014526705548539982\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6025692093208525,\n\
\ \"acc_stderr\": 0.004883663587184775,\n \"acc_norm\": 0.7909778928500298,\n\
\ \"acc_norm_stderr\": 0.004057792171893577\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4605263157894737,\n \"acc_stderr\": 0.04056242252249033,\n\
\ \"acc_norm\": 0.4605263157894737,\n \"acc_norm_stderr\": 0.04056242252249033\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.45,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5735849056603773,\n \"acc_stderr\": 0.030437794342983056,\n\
\ \"acc_norm\": 0.5735849056603773,\n \"acc_norm_stderr\": 0.030437794342983056\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.041553199555931467,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.041553199555931467\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.44508670520231214,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.44508670520231214,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n\
\ \"acc_stderr\": 0.044895393502707,\n \"acc_norm\": 0.3508771929824561,\n\
\ \"acc_norm_stderr\": 0.044895393502707\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.28835978835978837,\n \"acc_stderr\": 0.023330654054535886,\n \"\
acc_norm\": 0.28835978835978837,\n \"acc_norm_stderr\": 0.023330654054535886\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5161290322580645,\n\
\ \"acc_stderr\": 0.028429203176724555,\n \"acc_norm\": 0.5161290322580645,\n\
\ \"acc_norm_stderr\": 0.028429203176724555\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.37438423645320196,\n \"acc_stderr\": 0.03405155380561952,\n\
\ \"acc_norm\": 0.37438423645320196,\n \"acc_norm_stderr\": 0.03405155380561952\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.035014387062967806,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.035014387062967806\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6313131313131313,\n \"acc_stderr\": 0.034373055019806184,\n \"\
acc_norm\": 0.6313131313131313,\n \"acc_norm_stderr\": 0.034373055019806184\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7202072538860104,\n \"acc_stderr\": 0.032396370467357036,\n\
\ \"acc_norm\": 0.7202072538860104,\n \"acc_norm_stderr\": 0.032396370467357036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4641025641025641,\n \"acc_stderr\": 0.025285585990017845,\n\
\ \"acc_norm\": 0.4641025641025641,\n \"acc_norm_stderr\": 0.025285585990017845\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844086,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844086\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.49159663865546216,\n \"acc_stderr\": 0.03247390276569669,\n\
\ \"acc_norm\": 0.49159663865546216,\n \"acc_norm_stderr\": 0.03247390276569669\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.708256880733945,\n \"acc_stderr\": 0.01948930096887652,\n \"acc_norm\"\
: 0.708256880733945,\n \"acc_norm_stderr\": 0.01948930096887652\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.38425925925925924,\n\
\ \"acc_stderr\": 0.03317354514310742,\n \"acc_norm\": 0.38425925925925924,\n\
\ \"acc_norm_stderr\": 0.03317354514310742\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.03198001660115071,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.03198001660115071\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7130801687763713,\n \"acc_stderr\": 0.02944377302259469,\n \
\ \"acc_norm\": 0.7130801687763713,\n \"acc_norm_stderr\": 0.02944377302259469\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5695067264573991,\n\
\ \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.5695067264573991,\n\
\ \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5766871165644172,\n \"acc_stderr\": 0.03881891213334384,\n\
\ \"acc_norm\": 0.5766871165644172,\n \"acc_norm_stderr\": 0.03881891213334384\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6504854368932039,\n \"acc_stderr\": 0.047211885060971716,\n\
\ \"acc_norm\": 0.6504854368932039,\n \"acc_norm_stderr\": 0.047211885060971716\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7393162393162394,\n\
\ \"acc_stderr\": 0.02876034895652341,\n \"acc_norm\": 0.7393162393162394,\n\
\ \"acc_norm_stderr\": 0.02876034895652341\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7088122605363985,\n\
\ \"acc_stderr\": 0.01624608706970141,\n \"acc_norm\": 0.7088122605363985,\n\
\ \"acc_norm_stderr\": 0.01624608706970141\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5635838150289018,\n \"acc_stderr\": 0.02670054542494368,\n\
\ \"acc_norm\": 0.5635838150289018,\n \"acc_norm_stderr\": 0.02670054542494368\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26927374301675977,\n\
\ \"acc_stderr\": 0.014835616582882618,\n \"acc_norm\": 0.26927374301675977,\n\
\ \"acc_norm_stderr\": 0.014835616582882618\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5228758169934641,\n \"acc_stderr\": 0.028599936776089782,\n\
\ \"acc_norm\": 0.5228758169934641,\n \"acc_norm_stderr\": 0.028599936776089782\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5852090032154341,\n\
\ \"acc_stderr\": 0.027982680459759563,\n \"acc_norm\": 0.5852090032154341,\n\
\ \"acc_norm_stderr\": 0.027982680459759563\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.558641975308642,\n \"acc_stderr\": 0.027628737155668777,\n\
\ \"acc_norm\": 0.558641975308642,\n \"acc_norm_stderr\": 0.027628737155668777\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3829787234042553,\n \"acc_stderr\": 0.02899908090480618,\n \
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.02899908090480618\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3748370273794003,\n\
\ \"acc_stderr\": 0.012363652467551934,\n \"acc_norm\": 0.3748370273794003,\n\
\ \"acc_norm_stderr\": 0.012363652467551934\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.030365446477275675,\n\
\ \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.030365446477275675\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4852941176470588,\n \"acc_stderr\": 0.020219083895133924,\n \
\ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.020219083895133924\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.03125127591089165,\n\
\ \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.03125127591089165\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5323383084577115,\n\
\ \"acc_stderr\": 0.03528131472933607,\n \"acc_norm\": 0.5323383084577115,\n\
\ \"acc_norm_stderr\": 0.03528131472933607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3574051407588739,\n\
\ \"mc1_stderr\": 0.0167765996767294,\n \"mc2\": 0.5157743333677478,\n\
\ \"mc2_stderr\": 0.01586124547215222\n }\n}\n```"
repo_url: https://huggingface.co/RWKV/rwkv-raven-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|arc:challenge|25_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hellaswag|10_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T13:20:05.284068.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T13:20:05.284068.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T13:20:05.284068.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T13:20:05.284068.parquet'
- config_name: results
data_files:
- split: 2023_09_03T13_20_05.284068
path:
- results_2023-09-03T13:20:05.284068.parquet
- split: latest
path:
- results_2023-09-03T13:20:05.284068.parquet
---
# Dataset Card for Evaluation run of xxyyy123/20k_v1_lora_qkvo_rank14_v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/xxyyy123/20k_v1_lora_qkvo_rank14_v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [xxyyy123/20k_v1_lora_qkvo_rank14_v2](https://huggingface.co/xxyyy123/20k_v1_lora_qkvo_rank14_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xxyyy123__20k_v1_lora_qkvo_rank14_v2",
"harness_truthfulqa_mc_0",
split="train")
```
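The same pattern works for any of the per-task configurations listed in the YAML header. Their names can also be derived programmatically — a minimal sketch, assuming the `harness_hendrycksTest_<task>_<n_shot>` naming scheme shown above (the `mmlu_config_name` helper is illustrative, not part of the dataset):

```python
def mmlu_config_name(task: str, n_shot: int = 5) -> str:
    # Per-task configs in this repo are named "harness_hendrycksTest_<task>_<n_shot>"
    # (see the YAML header above); all MMLU tasks here were run 5-shot.
    return f"harness_hendrycksTest_{task}_{n_shot}"

print(mmlu_config_name("world_religions"))
# → harness_hendrycksTest_world_religions_5
```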
## Latest results
These are the [latest results from run 2023-09-03T13:20:05.284068](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__20k_v1_lora_qkvo_rank14_v2/blob/main/results_2023-09-03T13%3A20%3A05.284068.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5085078094647929,
"acc_stderr": 0.03515476481930117,
"acc_norm": 0.5121639524782741,
"acc_norm_stderr": 0.035139678425659286,
"mc1": 0.3574051407588739,
"mc1_stderr": 0.0167765996767294,
"mc2": 0.5157743333677478,
"mc2_stderr": 0.01586124547215222
},
"harness|arc:challenge|25": {
"acc": 0.5264505119453925,
"acc_stderr": 0.01459093135812017,
"acc_norm": 0.5537542662116041,
"acc_norm_stderr": 0.014526705548539982
},
"harness|hellaswag|10": {
"acc": 0.6025692093208525,
"acc_stderr": 0.004883663587184775,
"acc_norm": 0.7909778928500298,
"acc_norm_stderr": 0.004057792171893577
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4605263157894737,
"acc_stderr": 0.04056242252249033,
"acc_norm": 0.4605263157894737,
"acc_norm_stderr": 0.04056242252249033
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5735849056603773,
"acc_stderr": 0.030437794342983056,
"acc_norm": 0.5735849056603773,
"acc_norm_stderr": 0.030437794342983056
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.041553199555931467,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.041553199555931467
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.44508670520231214,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.44508670520231214,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46382978723404256,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.46382978723404256,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502707,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.28835978835978837,
"acc_stderr": 0.023330654054535886,
"acc_norm": 0.28835978835978837,
"acc_norm_stderr": 0.023330654054535886
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5161290322580645,
"acc_stderr": 0.028429203176724555,
"acc_norm": 0.5161290322580645,
"acc_norm_stderr": 0.028429203176724555
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.37438423645320196,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.37438423645320196,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.035014387062967806,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.035014387062967806
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6313131313131313,
"acc_stderr": 0.034373055019806184,
"acc_norm": 0.6313131313131313,
"acc_norm_stderr": 0.034373055019806184
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7202072538860104,
"acc_stderr": 0.032396370467357036,
"acc_norm": 0.7202072538860104,
"acc_norm_stderr": 0.032396370467357036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4641025641025641,
"acc_stderr": 0.025285585990017845,
"acc_norm": 0.4641025641025641,
"acc_norm_stderr": 0.025285585990017845
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844086,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844086
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.49159663865546216,
"acc_stderr": 0.03247390276569669,
"acc_norm": 0.49159663865546216,
"acc_norm_stderr": 0.03247390276569669
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242741,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242741
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.708256880733945,
"acc_stderr": 0.01948930096887652,
"acc_norm": 0.708256880733945,
"acc_norm_stderr": 0.01948930096887652
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.03198001660115071,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.03198001660115071
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7130801687763713,
"acc_stderr": 0.02944377302259469,
"acc_norm": 0.7130801687763713,
"acc_norm_stderr": 0.02944377302259469
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5695067264573991,
"acc_stderr": 0.033231973029429394,
"acc_norm": 0.5695067264573991,
"acc_norm_stderr": 0.033231973029429394
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497751,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497751
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5766871165644172,
"acc_stderr": 0.03881891213334384,
"acc_norm": 0.5766871165644172,
"acc_norm_stderr": 0.03881891213334384
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.6504854368932039,
"acc_stderr": 0.047211885060971716,
"acc_norm": 0.6504854368932039,
"acc_norm_stderr": 0.047211885060971716
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7393162393162394,
"acc_stderr": 0.02876034895652341,
"acc_norm": 0.7393162393162394,
"acc_norm_stderr": 0.02876034895652341
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7088122605363985,
"acc_stderr": 0.01624608706970141,
"acc_norm": 0.7088122605363985,
"acc_norm_stderr": 0.01624608706970141
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5635838150289018,
"acc_stderr": 0.02670054542494368,
"acc_norm": 0.5635838150289018,
"acc_norm_stderr": 0.02670054542494368
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26927374301675977,
"acc_stderr": 0.014835616582882618,
"acc_norm": 0.26927374301675977,
"acc_norm_stderr": 0.014835616582882618
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5228758169934641,
"acc_stderr": 0.028599936776089782,
"acc_norm": 0.5228758169934641,
"acc_norm_stderr": 0.028599936776089782
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5852090032154341,
"acc_stderr": 0.027982680459759563,
"acc_norm": 0.5852090032154341,
"acc_norm_stderr": 0.027982680459759563
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.558641975308642,
"acc_stderr": 0.027628737155668777,
"acc_norm": 0.558641975308642,
"acc_norm_stderr": 0.027628737155668777
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.02899908090480618,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.02899908090480618
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3748370273794003,
"acc_stderr": 0.012363652467551934,
"acc_norm": 0.3748370273794003,
"acc_norm_stderr": 0.012363652467551934
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4889705882352941,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.4889705882352941,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.020219083895133924,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.020219083895133924
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6081632653061224,
"acc_stderr": 0.03125127591089165,
"acc_norm": 0.6081632653061224,
"acc_norm_stderr": 0.03125127591089165
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5323383084577115,
"acc_stderr": 0.03528131472933607,
"acc_norm": 0.5323383084577115,
"acc_norm_stderr": 0.03528131472933607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6783625730994152,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.6783625730994152,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3574051407588739,
"mc1_stderr": 0.0167765996767294,
"mc2": 0.5157743333677478,
"mc2_stderr": 0.01586124547215222
}
}
```
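For quick local analysis, the JSON above can be sliced without re-downloading the dataset — a minimal sketch computing an unweighted mean accuracy over a handful of the MMLU (hendrycksTest) tasks; the values are copied from the results above, and the task selection is illustrative only:

```python
# Accuracies copied from the "Latest results" JSON above (illustrative subset).
results = {
    "harness|hendrycksTest-abstract_algebra|5": 0.28,
    "harness|hendrycksTest-anatomy|5": 0.48148148148148145,
    "harness|hendrycksTest-astronomy|5": 0.4605263157894737,
}

# Unweighted mean over the selected tasks.
mean_acc = sum(results.values()) / len(results)
print(round(mean_acc, 4))
# → 0.4073
```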
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
linhtran92/infer_on_testds_v1 | 2023-09-03T13:47:31.000Z | [
"region:us"
] | linhtran92 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: id
dtype: string
- name: w2v2_baseline_transcription
dtype: string
- name: w2v2_baseline_norm
dtype: string
splits:
- name: train
num_bytes: 174371437.027
num_examples: 1299
download_size: 164200336
dataset_size: 174371437.027
---
# Dataset Card for "infer_on_testds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_RWKV__rwkv-4-14b-pile | 2023-09-03T13:56:47.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of RWKV/rwkv-4-14b-pile
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [RWKV/rwkv-4-14b-pile](https://huggingface.co/RWKV/rwkv-4-14b-pile) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RWKV__rwkv-4-14b-pile\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-03T13:55:36.441206](https://huggingface.co/datasets/open-llm-leaderboard/details_RWKV__rwkv-4-14b-pile/blob/main/results_2023-09-03T13%3A55%3A36.441206.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26786573038872774,\n\
\ \"acc_stderr\": 0.03209427301121615,\n \"acc_norm\": 0.2719684993376118,\n\
\ \"acc_norm_stderr\": 0.0320909328665037,\n \"mc1\": 0.2141982864137087,\n\
\ \"mc1_stderr\": 0.014362148155690466,\n \"mc2\": 0.3204219399194067,\n\
\ \"mc2_stderr\": 0.01314889653839067\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.39078498293515357,\n \"acc_stderr\": 0.014258563880513778,\n\
\ \"acc_norm\": 0.4445392491467577,\n \"acc_norm_stderr\": 0.01452122640562708\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.522405895239992,\n\
\ \"acc_stderr\": 0.0049847689123269446,\n \"acc_norm\": 0.7107149970125473,\n\
\ \"acc_norm_stderr\": 0.004525037849178837\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n\
\ \"acc_stderr\": 0.037125378336148665,\n \"acc_norm\": 0.24444444444444444,\n\
\ \"acc_norm_stderr\": 0.037125378336148665\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21710526315789475,\n \"acc_stderr\": 0.033550453048829226,\n\
\ \"acc_norm\": 0.21710526315789475,\n \"acc_norm_stderr\": 0.033550453048829226\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2641509433962264,\n \"acc_stderr\": 0.027134291628741695,\n\
\ \"acc_norm\": 0.2641509433962264,\n \"acc_norm_stderr\": 0.027134291628741695\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.28901734104046245,\n\
\ \"acc_stderr\": 0.03456425745086999,\n \"acc_norm\": 0.28901734104046245,\n\
\ \"acc_norm_stderr\": 0.03456425745086999\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087785,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.030683020843231004,\n\
\ \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.030683020843231004\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2896551724137931,\n \"acc_stderr\": 0.03780019230438015,\n\
\ \"acc_norm\": 0.2896551724137931,\n \"acc_norm_stderr\": 0.03780019230438015\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525218,\n \"\
acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525218\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03718489006818114,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03718489006818114\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2032258064516129,\n \"acc_stderr\": 0.022891687984554952,\n \"\
acc_norm\": 0.2032258064516129,\n \"acc_norm_stderr\": 0.022891687984554952\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2413793103448276,\n \"acc_stderr\": 0.030108330718011625,\n \"\
acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.030108330718011625\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139405,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139405\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.02962022787479047,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02962022787479047\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.20725388601036268,\n \"acc_stderr\": 0.029252823291803624,\n\
\ \"acc_norm\": 0.20725388601036268,\n \"acc_norm_stderr\": 0.029252823291803624\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.21794871794871795,\n \"acc_stderr\": 0.020932445774463196,\n\
\ \"acc_norm\": 0.21794871794871795,\n \"acc_norm_stderr\": 0.020932445774463196\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868966,\n\
\ \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868966\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3119266055045872,\n \"acc_stderr\": 0.019862967976707245,\n \"\
acc_norm\": 0.3119266055045872,\n \"acc_norm_stderr\": 0.019862967976707245\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1712962962962963,\n \"acc_stderr\": 0.025695341643824688,\n \"\
acc_norm\": 0.1712962962962963,\n \"acc_norm_stderr\": 0.025695341643824688\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23039215686274508,\n \"acc_stderr\": 0.029554292605695063,\n \"\
acc_norm\": 0.23039215686274508,\n \"acc_norm_stderr\": 0.029554292605695063\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.23766816143497757,\n\
\ \"acc_stderr\": 0.028568079464714277,\n \"acc_norm\": 0.23766816143497757,\n\
\ \"acc_norm_stderr\": 0.028568079464714277\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2900763358778626,\n \"acc_stderr\": 0.03980066246467765,\n\
\ \"acc_norm\": 0.2900763358778626,\n \"acc_norm_stderr\": 0.03980066246467765\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.04157751539865629,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.04157751539865629\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.30097087378640774,\n \"acc_stderr\": 0.045416094465039455,\n\
\ \"acc_norm\": 0.30097087378640774,\n \"acc_norm_stderr\": 0.045416094465039455\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24358974358974358,\n\
\ \"acc_stderr\": 0.028120966503914404,\n \"acc_norm\": 0.24358974358974358,\n\
\ \"acc_norm_stderr\": 0.028120966503914404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.25287356321839083,\n\
\ \"acc_stderr\": 0.015543377313719681,\n \"acc_norm\": 0.25287356321839083,\n\
\ \"acc_norm_stderr\": 0.015543377313719681\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2398843930635838,\n \"acc_stderr\": 0.022989592543123567,\n\
\ \"acc_norm\": 0.2398843930635838,\n \"acc_norm_stderr\": 0.022989592543123567\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\
\ \"acc_stderr\": 0.014310999547961459,\n \"acc_norm\": 0.24134078212290502,\n\
\ \"acc_norm_stderr\": 0.014310999547961459\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.31699346405228757,\n \"acc_stderr\": 0.02664327847450875,\n\
\ \"acc_norm\": 0.31699346405228757,\n \"acc_norm_stderr\": 0.02664327847450875\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26366559485530544,\n\
\ \"acc_stderr\": 0.025025538500532338,\n \"acc_norm\": 0.26366559485530544,\n\
\ \"acc_norm_stderr\": 0.025025538500532338\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.02517104191530968,\n\
\ \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.02517104191530968\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24822695035460993,\n \"acc_stderr\": 0.025770015644290396,\n \
\ \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.025770015644290396\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.258148631029987,\n\
\ \"acc_stderr\": 0.011176923719313394,\n \"acc_norm\": 0.258148631029987,\n\
\ \"acc_norm_stderr\": 0.011176923719313394\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26633986928104575,\n \"acc_stderr\": 0.0178831881346672,\n \
\ \"acc_norm\": 0.26633986928104575,\n \"acc_norm_stderr\": 0.0178831881346672\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.33636363636363636,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.33636363636363636,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2816326530612245,\n \"acc_stderr\": 0.02879518557429127,\n\
\ \"acc_norm\": 0.2816326530612245,\n \"acc_norm_stderr\": 0.02879518557429127\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.263681592039801,\n\
\ \"acc_stderr\": 0.03115715086935558,\n \"acc_norm\": 0.263681592039801,\n\
\ \"acc_norm_stderr\": 0.03115715086935558\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n\
\ \"acc_stderr\": 0.03647168523683227,\n \"acc_norm\": 0.3253012048192771,\n\
\ \"acc_norm_stderr\": 0.03647168523683227\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.27485380116959063,\n \"acc_stderr\": 0.03424042924691584,\n\
\ \"acc_norm\": 0.27485380116959063,\n \"acc_norm_stderr\": 0.03424042924691584\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2141982864137087,\n\
\ \"mc1_stderr\": 0.014362148155690466,\n \"mc2\": 0.3204219399194067,\n\
\ \"mc2_stderr\": 0.01314889653839067\n }\n}\n```"
repo_url: https://huggingface.co/RWKV/rwkv-raven-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|arc:challenge|25_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hellaswag|10_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T13:55:36.441206.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T13:55:36.441206.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T13:55:36.441206.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T13:55:36.441206.parquet'
- config_name: results
data_files:
- split: 2023_09_03T13_55_36.441206
path:
- results_2023-09-03T13:55:36.441206.parquet
- split: latest
path:
- results_2023-09-03T13:55:36.441206.parquet
---
# Dataset Card for Evaluation run of RWKV/rwkv-4-14b-pile
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/RWKV/rwkv-4-14b-pile
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [RWKV/rwkv-4-14b-pile](https://huggingface.co/RWKV/rwkv-4-14b-pile) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RWKV__rwkv-4-14b-pile",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-03T13:55:36.441206](https://huggingface.co/datasets/open-llm-leaderboard/details_RWKV__rwkv-4-14b-pile/blob/main/results_2023-09-03T13%3A55%3A36.441206.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26786573038872774,
"acc_stderr": 0.03209427301121615,
"acc_norm": 0.2719684993376118,
"acc_norm_stderr": 0.0320909328665037,
"mc1": 0.2141982864137087,
"mc1_stderr": 0.014362148155690466,
"mc2": 0.3204219399194067,
"mc2_stderr": 0.01314889653839067
},
"harness|arc:challenge|25": {
"acc": 0.39078498293515357,
"acc_stderr": 0.014258563880513778,
"acc_norm": 0.4445392491467577,
"acc_norm_stderr": 0.01452122640562708
},
"harness|hellaswag|10": {
"acc": 0.522405895239992,
"acc_stderr": 0.0049847689123269446,
"acc_norm": 0.7107149970125473,
"acc_norm_stderr": 0.004525037849178837
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.037125378336148665,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.037125378336148665
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21710526315789475,
"acc_stderr": 0.033550453048829226,
"acc_norm": 0.21710526315789475,
"acc_norm_stderr": 0.033550453048829226
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2641509433962264,
"acc_stderr": 0.027134291628741695,
"acc_norm": 0.2641509433962264,
"acc_norm_stderr": 0.027134291628741695
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.28901734104046245,
"acc_stderr": 0.03456425745086999,
"acc_norm": 0.28901734104046245,
"acc_norm_stderr": 0.03456425745086999
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.043898699568087785,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.043898699568087785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.030683020843231004,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.030683020843231004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2896551724137931,
"acc_stderr": 0.03780019230438015,
"acc_norm": 0.2896551724137931,
"acc_norm_stderr": 0.03780019230438015
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525218,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525218
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03718489006818114,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03718489006818114
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2032258064516129,
"acc_stderr": 0.022891687984554952,
"acc_norm": 0.2032258064516129,
"acc_norm_stderr": 0.022891687984554952
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.030108330718011625,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.030108330718011625
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139405,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139405
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02962022787479047,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02962022787479047
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.20725388601036268,
"acc_stderr": 0.029252823291803624,
"acc_norm": 0.20725388601036268,
"acc_norm_stderr": 0.029252823291803624
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.21794871794871795,
"acc_stderr": 0.020932445774463196,
"acc_norm": 0.21794871794871795,
"acc_norm_stderr": 0.020932445774463196
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868966,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868966
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3119266055045872,
"acc_stderr": 0.019862967976707245,
"acc_norm": 0.3119266055045872,
"acc_norm_stderr": 0.019862967976707245
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1712962962962963,
"acc_stderr": 0.025695341643824688,
"acc_norm": 0.1712962962962963,
"acc_norm_stderr": 0.025695341643824688
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23039215686274508,
"acc_stderr": 0.029554292605695063,
"acc_norm": 0.23039215686274508,
"acc_norm_stderr": 0.029554292605695063
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.23766816143497757,
"acc_stderr": 0.028568079464714277,
"acc_norm": 0.23766816143497757,
"acc_norm_stderr": 0.028568079464714277
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2900763358778626,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.2900763358778626,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.04157751539865629,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.04157751539865629
},
"harness|hendrycksTest-management|5": {
"acc": 0.30097087378640774,
"acc_stderr": 0.045416094465039455,
"acc_norm": 0.30097087378640774,
"acc_norm_stderr": 0.045416094465039455
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.028120966503914404,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.028120966503914404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.25287356321839083,
"acc_stderr": 0.015543377313719681,
"acc_norm": 0.25287356321839083,
"acc_norm_stderr": 0.015543377313719681
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2398843930635838,
"acc_stderr": 0.022989592543123567,
"acc_norm": 0.2398843930635838,
"acc_norm_stderr": 0.022989592543123567
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961459,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961459
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.31699346405228757,
"acc_stderr": 0.02664327847450875,
"acc_norm": 0.31699346405228757,
"acc_norm_stderr": 0.02664327847450875
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26366559485530544,
"acc_stderr": 0.025025538500532338,
"acc_norm": 0.26366559485530544,
"acc_norm_stderr": 0.025025538500532338
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.02517104191530968,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.02517104191530968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.025770015644290396,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.025770015644290396
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.258148631029987,
"acc_stderr": 0.011176923719313394,
"acc_norm": 0.258148631029987,
"acc_norm_stderr": 0.011176923719313394
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26633986928104575,
"acc_stderr": 0.0178831881346672,
"acc_norm": 0.26633986928104575,
"acc_norm_stderr": 0.0178831881346672
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.33636363636363636,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.33636363636363636,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2816326530612245,
"acc_stderr": 0.02879518557429127,
"acc_norm": 0.2816326530612245,
"acc_norm_stderr": 0.02879518557429127
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.263681592039801,
"acc_stderr": 0.03115715086935558,
"acc_norm": 0.263681592039801,
"acc_norm_stderr": 0.03115715086935558
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.03647168523683227,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.03647168523683227
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.27485380116959063,
"acc_stderr": 0.03424042924691584,
"acc_norm": 0.27485380116959063,
"acc_norm_stderr": 0.03424042924691584
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2141982864137087,
"mc1_stderr": 0.014362148155690466,
"mc2": 0.3204219399194067,
"mc2_stderr": 0.01314889653839067
}
}
```
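As an illustrative sketch (not part of the official evaluation tooling), an average over the per-task `hendrycksTest` accuracies can be computed directly from this results dictionary; the snippet below hard-codes a small subset of the figures above, so its average differs from the full `"all"` aggregate:

```python
# Average per-task accuracy over a subset of the results shown above.
# Task names and figures are copied from the JSON block; the subset is
# illustrative only, so the mean differs from the "all" aggregate.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.26},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.24444444444444444},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.21710526315789475},
}

# Select the MMLU (hendrycksTest) tasks by their harness name prefix.
mmlu_tasks = [name for name in results if name.startswith("harness|hendrycksTest-")]
mmlu_acc = sum(results[t]["acc"] for t in mmlu_tasks) / len(mmlu_tasks)
print(f"Average acc over {len(mmlu_tasks)} MMLU tasks: {mmlu_acc:.4f}")
```

The same pattern extends to all 57 `hendrycksTest` entries in the JSON above.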
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
linhtran92/infer_on_testds_v2 | 2023-09-03T13:58:24.000Z | [
"region:us"
] | linhtran92 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: id
dtype: string
- name: w2v2_baseline_transcription
dtype: string
- name: w2v2_baseline_norm
dtype: string
splits:
- name: train
num_bytes: 174371557.027
num_examples: 1299
download_size: 164199656
dataset_size: 174371557.027
---
# Dataset Card for "infer_on_testds_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sngsfydy/Messidor2_except_0 | 2023-09-03T14:04:20.000Z | [
"region:us"
] | sngsfydy | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '1'
'1': '2'
'2': '3'
'3': '4'
splits:
- name: train
num_bytes: 1381059381.0
num_examples: 727
download_size: 1375867454
dataset_size: 1381059381.0
---
# Dataset Card for "Messidor2_except_0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
oiramario/VFIformer | 2023-09-03T14:13:30.000Z | [
"region:us"
] | oiramario | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_Undi95__ReMM-L2-13B | 2023-09-03T14:16:49.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Undi95/ReMM-L2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/ReMM-L2-13B](https://huggingface.co/Undi95/ReMM-L2-13B) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__ReMM-L2-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-03T14:15:27.893202](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__ReMM-L2-13B/blob/main/results_2023-09-03T14%3A15%3A27.893202.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5432971544031889,\n\
\ \"acc_stderr\": 0.03447079755702233,\n \"acc_norm\": 0.5470104530937225,\n\
\ \"acc_norm_stderr\": 0.034450698941066726,\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.016711358163544403,\n \"mc2\": 0.49938332230288074,\n\
\ \"mc2_stderr\": 0.015748300557574715\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5716723549488054,\n \"acc_stderr\": 0.014460496367599015,\n\
\ \"acc_norm\": 0.5972696245733788,\n \"acc_norm_stderr\": 0.014332236306790149\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.63752240589524,\n \
\ \"acc_stderr\": 0.004797332565990075,\n \"acc_norm\": 0.831009759012149,\n\
\ \"acc_norm_stderr\": 0.003739774285418524\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04068942293855797,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04068942293855797\n },\n\
\ \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n \
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n\
\ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n\
\ \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.5972222222222222,\n\
\ \"acc_norm_stderr\": 0.04101405519842426\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5028901734104047,\n\
\ \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.5028901734104047,\n\
\ \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087764,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087764\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.328042328042328,\n \"acc_stderr\": 0.024180497164376896,\n \"\
acc_norm\": 0.328042328042328,\n \"acc_norm_stderr\": 0.024180497164376896\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6258064516129033,\n\
\ \"acc_stderr\": 0.027528904299845704,\n \"acc_norm\": 0.6258064516129033,\n\
\ \"acc_norm_stderr\": 0.027528904299845704\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n\
\ \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0368105086916155,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0368105086916155\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6767676767676768,\n \"acc_stderr\": 0.033322999210706444,\n \"\
acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.033322999210706444\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.029519282616817247,\n\
\ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.029519282616817247\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5153846153846153,\n \"acc_stderr\": 0.025339003010106515,\n\
\ \"acc_norm\": 0.5153846153846153,\n \"acc_norm_stderr\": 0.025339003010106515\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02730914058823018,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02730914058823018\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.03243718055137411,\n \
\ \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.03243718055137411\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7119266055045872,\n \"acc_stderr\": 0.01941644589263603,\n \"\
acc_norm\": 0.7119266055045872,\n \"acc_norm_stderr\": 0.01941644589263603\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.033247089118091176,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.033247089118091176\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251742,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251742\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7130801687763713,\n \"acc_stderr\": 0.02944377302259469,\n \
\ \"acc_norm\": 0.7130801687763713,\n \"acc_norm_stderr\": 0.02944377302259469\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935575,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935575\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6504854368932039,\n \"acc_stderr\": 0.04721188506097172,\n\
\ \"acc_norm\": 0.6504854368932039,\n \"acc_norm_stderr\": 0.04721188506097172\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7649572649572649,\n\
\ \"acc_stderr\": 0.02777883590493543,\n \"acc_norm\": 0.7649572649572649,\n\
\ \"acc_norm_stderr\": 0.02777883590493543\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7343550446998723,\n\
\ \"acc_stderr\": 0.015794302487888726,\n \"acc_norm\": 0.7343550446998723,\n\
\ \"acc_norm_stderr\": 0.015794302487888726\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0259924720293064,\n\
\ \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0259924720293064\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3016759776536313,\n\
\ \"acc_stderr\": 0.015350767572220286,\n \"acc_norm\": 0.3016759776536313,\n\
\ \"acc_norm_stderr\": 0.015350767572220286\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.02787074527829028,\n\
\ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.02787074527829028\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\
\ \"acc_stderr\": 0.027316847674192707,\n \"acc_norm\": 0.6366559485530546,\n\
\ \"acc_norm_stderr\": 0.027316847674192707\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6141975308641975,\n \"acc_stderr\": 0.027085401226132146,\n\
\ \"acc_norm\": 0.6141975308641975,\n \"acc_norm_stderr\": 0.027085401226132146\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4078014184397163,\n \"acc_stderr\": 0.029316011776343555,\n \
\ \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.029316011776343555\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4211212516297262,\n\
\ \"acc_stderr\": 0.012610325733489906,\n \"acc_norm\": 0.4211212516297262,\n\
\ \"acc_norm_stderr\": 0.012610325733489906\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5036764705882353,\n \"acc_stderr\": 0.030372015885428195,\n\
\ \"acc_norm\": 0.5036764705882353,\n \"acc_norm_stderr\": 0.030372015885428195\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5522875816993464,\n \"acc_stderr\": 0.020116925347422425,\n \
\ \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.020116925347422425\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6163265306122448,\n \"acc_stderr\": 0.031130880396235946,\n\
\ \"acc_norm\": 0.6163265306122448,\n \"acc_norm_stderr\": 0.031130880396235946\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n\
\ \"acc_stderr\": 0.03235743789355042,\n \"acc_norm\": 0.7014925373134329,\n\
\ \"acc_norm_stderr\": 0.03235743789355042\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.03401052620104089,\n\
\ \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.03401052620104089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.016711358163544403,\n \"mc2\": 0.49938332230288074,\n\
\ \"mc2_stderr\": 0.015748300557574715\n }\n}\n```"
repo_url: https://huggingface.co/Undi95/ReMM-L2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|arc:challenge|25_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hellaswag|10_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T14:15:27.893202.parquet'
- config_name: results
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- results_2023-09-03T14:15:27.893202.parquet
- split: latest
path:
- results_2023-09-03T14:15:27.893202.parquet
---
# Dataset Card for Evaluation run of Undi95/ReMM-L2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/ReMM-L2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/ReMM-L2-13B](https://huggingface.co/Undi95/ReMM-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__ReMM-L2-13B",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-03T14:15:27.893202](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__ReMM-L2-13B/blob/main/results_2023-09-03T14%3A15%3A27.893202.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5432971544031889,
"acc_stderr": 0.03447079755702233,
"acc_norm": 0.5470104530937225,
"acc_norm_stderr": 0.034450698941066726,
"mc1": 0.35128518971848227,
"mc1_stderr": 0.016711358163544403,
"mc2": 0.49938332230288074,
"mc2_stderr": 0.015748300557574715
},
"harness|arc:challenge|25": {
"acc": 0.5716723549488054,
"acc_stderr": 0.014460496367599015,
"acc_norm": 0.5972696245733788,
"acc_norm_stderr": 0.014332236306790149
},
"harness|hellaswag|10": {
"acc": 0.63752240589524,
"acc_stderr": 0.004797332565990075,
"acc_norm": 0.831009759012149,
"acc_norm_stderr": 0.003739774285418524
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5,
"acc_stderr": 0.04068942293855797,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04068942293855797
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.04101405519842426,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.04101405519842426
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.043898699568087764,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.043898699568087764
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.024180497164376896,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.024180497164376896
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6258064516129033,
"acc_stderr": 0.027528904299845704,
"acc_norm": 0.6258064516129033,
"acc_norm_stderr": 0.027528904299845704
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0368105086916155,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0368105086916155
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6767676767676768,
"acc_stderr": 0.033322999210706444,
"acc_norm": 0.6767676767676768,
"acc_norm_stderr": 0.033322999210706444
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.029519282616817247,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.029519282616817247
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5153846153846153,
"acc_stderr": 0.025339003010106515,
"acc_norm": 0.5153846153846153,
"acc_norm_stderr": 0.025339003010106515
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02730914058823018,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02730914058823018
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5252100840336135,
"acc_stderr": 0.03243718055137411,
"acc_norm": 0.5252100840336135,
"acc_norm_stderr": 0.03243718055137411
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7119266055045872,
"acc_stderr": 0.01941644589263603,
"acc_norm": 0.7119266055045872,
"acc_norm_stderr": 0.01941644589263603
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.033247089118091176,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.033247089118091176
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251742,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251742
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7130801687763713,
"acc_stderr": 0.02944377302259469,
"acc_norm": 0.7130801687763713,
"acc_norm_stderr": 0.02944377302259469
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455005,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935575,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935575
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.6504854368932039,
"acc_stderr": 0.04721188506097172,
"acc_norm": 0.6504854368932039,
"acc_norm_stderr": 0.04721188506097172
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7649572649572649,
"acc_stderr": 0.02777883590493543,
"acc_norm": 0.7649572649572649,
"acc_norm_stderr": 0.02777883590493543
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7343550446998723,
"acc_stderr": 0.015794302487888726,
"acc_norm": 0.7343550446998723,
"acc_norm_stderr": 0.015794302487888726
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0259924720293064,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0259924720293064
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3016759776536313,
"acc_stderr": 0.015350767572220286,
"acc_norm": 0.3016759776536313,
"acc_norm_stderr": 0.015350767572220286
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.02787074527829028,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.02787074527829028
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6366559485530546,
"acc_stderr": 0.027316847674192707,
"acc_norm": 0.6366559485530546,
"acc_norm_stderr": 0.027316847674192707
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6141975308641975,
"acc_stderr": 0.027085401226132146,
"acc_norm": 0.6141975308641975,
"acc_norm_stderr": 0.027085401226132146
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4078014184397163,
"acc_stderr": 0.029316011776343555,
"acc_norm": 0.4078014184397163,
"acc_norm_stderr": 0.029316011776343555
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4211212516297262,
"acc_stderr": 0.012610325733489906,
"acc_norm": 0.4211212516297262,
"acc_norm_stderr": 0.012610325733489906
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5036764705882353,
"acc_stderr": 0.030372015885428195,
"acc_norm": 0.5036764705882353,
"acc_norm_stderr": 0.030372015885428195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5522875816993464,
"acc_stderr": 0.020116925347422425,
"acc_norm": 0.5522875816993464,
"acc_norm_stderr": 0.020116925347422425
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6163265306122448,
"acc_stderr": 0.031130880396235946,
"acc_norm": 0.6163265306122448,
"acc_norm_stderr": 0.031130880396235946
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.03235743789355042,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.03235743789355042
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7309941520467836,
"acc_stderr": 0.03401052620104089,
"acc_norm": 0.7309941520467836,
"acc_norm_stderr": 0.03401052620104089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35128518971848227,
"mc1_stderr": 0.016711358163544403,
"mc2": 0.49938332230288074,
"mc2_stderr": 0.015748300557574715
}
}
```
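The top-level `"all"` block above is, to a close approximation, the unweighted mean of the per-task scores. A minimal sketch of that relationship, using three of the per-task `acc` values copied from the results above (this assumes a simple unweighted mean, which may not match the leaderboard's exact aggregation in every detail):

```python
# Sketch: the aggregated "acc" as an unweighted mean of per-task accuracies.
# Task names and values are copied verbatim from the results block above.
per_task_acc = {
    "harness|arc:challenge|25": 0.5716723549488054,
    "harness|hellaswag|10": 0.63752240589524,
    "harness|hendrycksTest-abstract_algebra|5": 0.32,
}

# Unweighted mean over the selected tasks (an assumption; see note above).
mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(round(mean_acc, 4))  # → 0.5097
```

Averaging over all 59 task entries (rather than the three shown here) reproduces the `"all"` accuracy reported at the top of the results.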
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
jmgb0127/FronxOwnerManual | 2023-09-03T14:24:43.000Z | [
"region:us"
] | jmgb0127 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1330685
num_examples: 1177
- name: test
num_bytes: 332811
num_examples: 294
download_size: 990561
dataset_size: 1663496
---
# Dataset Card for "FronxOwnerManual"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bimbette/SW_in_film | 2023-09-03T14:26:33.000Z | [
"license:other",
"region:us"
] | bimbette | null | null | null | 0 | 0 | ---
license: other
---
|
bereziat/meteonet | 2023-09-04T08:17:21.000Z | [
"region:us"
] | bereziat | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank28_v2 | 2023-09-17T22:28:56.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of xxyyy123/10k_v1_lora_qkvo_rank28_v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xxyyy123/10k_v1_lora_qkvo_rank28_v2](https://huggingface.co/xxyyy123/10k_v1_lora_qkvo_rank28_v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank28_v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T22:28:45.139807](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank28_v2/blob/main/results_2023-09-17T22-28-45.139807.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.30253775167785235,\n\
\ \"em_stderr\": 0.004704243479116463,\n \"f1\": 0.3736682046979874,\n\
\ \"f1_stderr\": 0.004609071808093349,\n \"acc\": 0.36925201639806293,\n\
\ \"acc_stderr\": 0.007290194379176739\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.30253775167785235,\n \"em_stderr\": 0.004704243479116463,\n\
\ \"f1\": 0.3736682046979874,\n \"f1_stderr\": 0.004609071808093349\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006065200909780136,\n \
\ \"acc_stderr\": 0.0021386703014604704\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7324388318863457,\n \"acc_stderr\": 0.012441718456893009\n\
\ }\n}\n```"
repo_url: https://huggingface.co/xxyyy123/10k_v1_lora_qkvo_rank28_v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|arc:challenge|25_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T22_28_45.139807
path:
- '**/details_harness|drop|3_2023-09-17T22-28-45.139807.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T22-28-45.139807.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T22_28_45.139807
path:
- '**/details_harness|gsm8k|5_2023-09-17T22-28-45.139807.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T22-28-45.139807.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hellaswag|10_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T14:46:59.619219.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T14:46:59.619219.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T14:46:59.619219.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T22_28_45.139807
path:
- '**/details_harness|winogrande|5_2023-09-17T22-28-45.139807.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T22-28-45.139807.parquet'
- config_name: results
data_files:
- split: 2023_09_03T14_46_59.619219
path:
- results_2023-09-03T14:46:59.619219.parquet
- split: 2023_09_17T22_28_45.139807
path:
- results_2023-09-17T22-28-45.139807.parquet
- split: latest
path:
- results_2023-09-17T22-28-45.139807.parquet
---
# Dataset Card for Evaluation run of xxyyy123/10k_v1_lora_qkvo_rank28_v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/xxyyy123/10k_v1_lora_qkvo_rank28_v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [xxyyy123/10k_v1_lora_qkvo_rank28_v2](https://huggingface.co/xxyyy123/10k_v1_lora_qkvo_rank28_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank28_v2",
"harness_winogrande_5",
split="train")
```
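The split names in the configs above appear to be derived from the run timestamps by replacing characters that are awkward in identifiers. A minimal sketch of that convention, inferred from the names in this card (the helper `timestamp_to_split_name` is hypothetical, not part of the leaderboard tooling):

```python
def timestamp_to_split_name(ts: str) -> str:
    """Map a run timestamp to the split name used in the configs.

    Inferred convention: '-' and ':' become '_', while the
    fractional-second dot is kept. This is a guess from the names
    observed in this card, not the leaderboard's actual code.
    """
    return ts.replace("-", "_").replace(":", "_")

# Both timestamp styles seen in this card map to their split names:
print(timestamp_to_split_name("2023-09-03T14:46:59.619219"))
# 2023_09_03T14_46_59.619219
print(timestamp_to_split_name("2023-09-17T22-28-45.139807"))
# 2023_09_17T22_28_45.139807
```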
## Latest results
These are the [latest results from run 2023-09-17T22:28:45.139807](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank28_v2/blob/main/results_2023-09-17T22-28-45.139807.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.30253775167785235,
"em_stderr": 0.004704243479116463,
"f1": 0.3736682046979874,
"f1_stderr": 0.004609071808093349,
"acc": 0.36925201639806293,
"acc_stderr": 0.007290194379176739
},
"harness|drop|3": {
"em": 0.30253775167785235,
"em_stderr": 0.004704243479116463,
"f1": 0.3736682046979874,
"f1_stderr": 0.004609071808093349
},
"harness|gsm8k|5": {
"acc": 0.006065200909780136,
"acc_stderr": 0.0021386703014604704
},
"harness|winogrande|5": {
"acc": 0.7324388318863457,
"acc_stderr": 0.012441718456893009
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
BarkAi/Waifu | 2023-09-03T14:47:49.000Z | [
"license:creativeml-openrail-m",
"region:us"
] | BarkAi | null | null | null | 0 | 0 | ---
license: creativeml-openrail-m
---
|
Jana1994/cym_corpora_parliament_processed | 2023-09-03T15:04:25.000Z | [
"region:us"
] | Jana1994 | null | null | null | 0 | 0 | Entry not found |
xico1024/jm | 2023-09-03T15:21:49.000Z | [
"region:us"
] | xico1024 | null | null | null | 0 | 0 | Entry not found |
Outrun32/road96screenshots | 2023-09-03T15:16:53.000Z | [
"region:us"
] | Outrun32 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qk_rank14_v2 | 2023-09-03T15:47:42.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of xxyyy123/10k_v1_lora_qk_rank14_v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xxyyy123/10k_v1_lora_qk_rank14_v2](https://huggingface.co/xxyyy123/10k_v1_lora_qk_rank14_v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qk_rank14_v2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-03T15:46:18.274387](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qk_rank14_v2/blob/main/results_2023-09-03T15%3A46%3A18.274387.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5170296348361414,\n\
\ \"acc_stderr\": 0.03493290232216538,\n \"acc_norm\": 0.5207737377982975,\n\
\ \"acc_norm_stderr\": 0.034916919339016556,\n \"mc1\": 0.3574051407588739,\n\
\ \"mc1_stderr\": 0.016776599676729398,\n \"mc2\": 0.5241397415740128,\n\
\ \"mc2_stderr\": 0.0157002252598079\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5298634812286689,\n \"acc_stderr\": 0.014585305840007105,\n\
\ \"acc_norm\": 0.5648464163822525,\n \"acc_norm_stderr\": 0.014487986197186043\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6100378410675165,\n\
\ \"acc_stderr\": 0.004867445945277159,\n \"acc_norm\": 0.7959569806811392,\n\
\ \"acc_norm_stderr\": 0.004021769582317863\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.040657710025626036,\n\
\ \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.040657710025626036\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n\
\ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n\
\ \"acc_stderr\": 0.041614023984032786,\n \"acc_norm\": 0.5486111111111112,\n\
\ \"acc_norm_stderr\": 0.041614023984032786\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4797687861271676,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.4797687861271676,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929775,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929775\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.032579014820998356,\n\
\ \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.032579014820998356\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.023517294335963286,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.023517294335963286\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5709677419354838,\n\
\ \"acc_stderr\": 0.028156036538233193,\n \"acc_norm\": 0.5709677419354838,\n\
\ \"acc_norm_stderr\": 0.028156036538233193\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3694581280788177,\n \"acc_stderr\": 0.03395970381998573,\n\
\ \"acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.03395970381998573\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n\
\ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6818181818181818,\n \"acc_stderr\": 0.033184773338453294,\n \"\
acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.033184773338453294\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7409326424870466,\n \"acc_stderr\": 0.03161877917935413,\n\
\ \"acc_norm\": 0.7409326424870466,\n \"acc_norm_stderr\": 0.03161877917935413\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4897435897435897,\n \"acc_stderr\": 0.025345672221942374,\n\
\ \"acc_norm\": 0.4897435897435897,\n \"acc_norm_stderr\": 0.025345672221942374\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844075,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5084033613445378,\n \"acc_stderr\": 0.0324739027656967,\n \
\ \"acc_norm\": 0.5084033613445378,\n \"acc_norm_stderr\": 0.0324739027656967\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.728440366972477,\n \"acc_stderr\": 0.01906909836319144,\n \"acc_norm\"\
: 0.728440366972477,\n \"acc_norm_stderr\": 0.01906909836319144\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.032566854844603886,\n\
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.032566854844603886\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598035,\n \
\ \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598035\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n\
\ \"acc_stderr\": 0.03292802819330314,\n \"acc_norm\": 0.5964125560538116,\n\
\ \"acc_norm_stderr\": 0.03292802819330314\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.04345724570292534,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.04345724570292534\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5950920245398773,\n \"acc_stderr\": 0.03856672163548914,\n\
\ \"acc_norm\": 0.5950920245398773,\n \"acc_norm_stderr\": 0.03856672163548914\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572924,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572924\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.782051282051282,\n\
\ \"acc_stderr\": 0.02704685763071669,\n \"acc_norm\": 0.782051282051282,\n\
\ \"acc_norm_stderr\": 0.02704685763071669\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7088122605363985,\n\
\ \"acc_stderr\": 0.016246087069701407,\n \"acc_norm\": 0.7088122605363985,\n\
\ \"acc_norm_stderr\": 0.016246087069701407\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.026680134761679214,\n\
\ \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.026680134761679214\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2681564245810056,\n\
\ \"acc_stderr\": 0.014816119635317003,\n \"acc_norm\": 0.2681564245810056,\n\
\ \"acc_norm_stderr\": 0.014816119635317003\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.028541722692618874,\n\
\ \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.028541722692618874\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5819935691318328,\n\
\ \"acc_stderr\": 0.028013651891995072,\n \"acc_norm\": 0.5819935691318328,\n\
\ \"acc_norm_stderr\": 0.028013651891995072\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.027648477877413324,\n\
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.027648477877413324\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.028838921471251458,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.028838921471251458\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38396349413298564,\n\
\ \"acc_stderr\": 0.01242158783313423,\n \"acc_norm\": 0.38396349413298564,\n\
\ \"acc_norm_stderr\": 0.01242158783313423\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.03035969707904611,\n\
\ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03035969707904611\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4869281045751634,\n \"acc_stderr\": 0.020220920829626912,\n \
\ \"acc_norm\": 0.4869281045751634,\n \"acc_norm_stderr\": 0.020220920829626912\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.03125127591089165,\n\
\ \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.03125127591089165\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6019900497512438,\n\
\ \"acc_stderr\": 0.03461199429040013,\n \"acc_norm\": 0.6019900497512438,\n\
\ \"acc_norm_stderr\": 0.03461199429040013\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.03851597683718534,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.03851597683718534\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.03528211258245229,\n\
\ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.03528211258245229\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3574051407588739,\n\
\ \"mc1_stderr\": 0.016776599676729398,\n \"mc2\": 0.5241397415740128,\n\
\ \"mc2_stderr\": 0.0157002252598079\n }\n}\n```"
repo_url: https://huggingface.co/xxyyy123/10k_v1_lora_qk_rank14_v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|arc:challenge|25_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hellaswag|10_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T15:46:18.274387.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T15:46:18.274387.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T15:46:18.274387.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T15:46:18.274387.parquet'
- config_name: results
data_files:
- split: 2023_09_03T15_46_18.274387
path:
- results_2023-09-03T15:46:18.274387.parquet
- split: latest
path:
- results_2023-09-03T15:46:18.274387.parquet
---
# Dataset Card for Evaluation run of xxyyy123/10k_v1_lora_qk_rank14_v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/xxyyy123/10k_v1_lora_qk_rank14_v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [xxyyy123/10k_v1_lora_qk_rank14_v2](https://huggingface.co/xxyyy123/10k_v1_lora_qk_rank14_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qk_rank14_v2",
"harness_truthfulqa_mc_0",
split="train")
```
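Once loaded, the aggregated scores can be post-processed locally. As a minimal sketch (a hypothetical helper, not part of the leaderboard tooling), the snippet below averages `acc_norm` over per-task entries of a dict shaped like the "Latest results" JSON in this card, skipping the pre-aggregated `"all"` entry:

```python
def mean_acc_norm(results: dict) -> float:
    """Average `acc_norm` over every task entry that reports it,
    skipping the pre-aggregated "all" summary entry."""
    scores = [
        metrics["acc_norm"]
        for task, metrics in results.items()
        if task != "all" and "acc_norm" in metrics
    ]
    return sum(scores) / len(scores)

# Two task entries taken from the results JSON shown below.
sample = {
    "all": {"acc_norm": 0.5207737377982975},
    "harness|arc:challenge|25": {"acc_norm": 0.5648464163822525},
    "harness|hellaswag|10": {"acc_norm": 0.7959569806811392},
}
print(round(mean_acc_norm(sample), 4))  # mean of the two task scores
```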
## Latest results
These are the [latest results from run 2023-09-03T15:46:18.274387](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qk_rank14_v2/blob/main/results_2023-09-03T15%3A46%3A18.274387.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5170296348361414,
"acc_stderr": 0.03493290232216538,
"acc_norm": 0.5207737377982975,
"acc_norm_stderr": 0.034916919339016556,
"mc1": 0.3574051407588739,
"mc1_stderr": 0.016776599676729398,
"mc2": 0.5241397415740128,
"mc2_stderr": 0.0157002252598079
},
"harness|arc:challenge|25": {
"acc": 0.5298634812286689,
"acc_stderr": 0.014585305840007105,
"acc_norm": 0.5648464163822525,
"acc_norm_stderr": 0.014487986197186043
},
"harness|hellaswag|10": {
"acc": 0.6100378410675165,
"acc_stderr": 0.004867445945277159,
"acc_norm": 0.7959569806811392,
"acc_norm_stderr": 0.004021769582317863
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5486111111111112,
"acc_stderr": 0.041614023984032786,
"acc_norm": 0.5486111111111112,
"acc_norm_stderr": 0.041614023984032786
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4797687861271676,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.4797687861271676,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929775,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929775
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4595744680851064,
"acc_stderr": 0.032579014820998356,
"acc_norm": 0.4595744680851064,
"acc_norm_stderr": 0.032579014820998356
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.023517294335963286,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.023517294335963286
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5709677419354838,
"acc_stderr": 0.028156036538233193,
"acc_norm": 0.5709677419354838,
"acc_norm_stderr": 0.028156036538233193
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3694581280788177,
"acc_stderr": 0.03395970381998573,
"acc_norm": 0.3694581280788177,
"acc_norm_stderr": 0.03395970381998573
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.033184773338453294,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.033184773338453294
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7409326424870466,
"acc_stderr": 0.03161877917935413,
"acc_norm": 0.7409326424870466,
"acc_norm_stderr": 0.03161877917935413
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4897435897435897,
"acc_stderr": 0.025345672221942374,
"acc_norm": 0.4897435897435897,
"acc_norm_stderr": 0.025345672221942374
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844075,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5084033613445378,
"acc_stderr": 0.0324739027656967,
"acc_norm": 0.5084033613445378,
"acc_norm_stderr": 0.0324739027656967
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.728440366972477,
"acc_stderr": 0.01906909836319144,
"acc_norm": 0.728440366972477,
"acc_norm_stderr": 0.01906909836319144
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.375,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.375,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.032566854844603886,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.032566854844603886
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.029041333510598035,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.029041333510598035
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5964125560538116,
"acc_stderr": 0.03292802819330314,
"acc_norm": 0.5964125560538116,
"acc_norm_stderr": 0.03292802819330314
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.04345724570292534,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.04345724570292534
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5950920245398773,
"acc_stderr": 0.03856672163548914,
"acc_norm": 0.5950920245398773,
"acc_norm_stderr": 0.03856672163548914
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.04498676320572924,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.04498676320572924
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.782051282051282,
"acc_stderr": 0.02704685763071669,
"acc_norm": 0.782051282051282,
"acc_norm_stderr": 0.02704685763071669
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7088122605363985,
"acc_stderr": 0.016246087069701407,
"acc_norm": 0.7088122605363985,
"acc_norm_stderr": 0.016246087069701407
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.026680134761679214,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.026680134761679214
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2681564245810056,
"acc_stderr": 0.014816119635317003,
"acc_norm": 0.2681564245810056,
"acc_norm_stderr": 0.014816119635317003
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.028541722692618874,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.028541722692618874
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5819935691318328,
"acc_stderr": 0.028013651891995072,
"acc_norm": 0.5819935691318328,
"acc_norm_stderr": 0.028013651891995072
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.027648477877413324,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.027648477877413324
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.028838921471251458,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.028838921471251458
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38396349413298564,
"acc_stderr": 0.01242158783313423,
"acc_norm": 0.38396349413298564,
"acc_norm_stderr": 0.01242158783313423
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.03035969707904611,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.03035969707904611
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4869281045751634,
"acc_stderr": 0.020220920829626912,
"acc_norm": 0.4869281045751634,
"acc_norm_stderr": 0.020220920829626912
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6081632653061224,
"acc_stderr": 0.03125127591089165,
"acc_norm": 0.6081632653061224,
"acc_norm_stderr": 0.03125127591089165
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6019900497512438,
"acc_stderr": 0.03461199429040013,
"acc_norm": 0.6019900497512438,
"acc_norm_stderr": 0.03461199429040013
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.03851597683718534,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.03851597683718534
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.03528211258245229,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.03528211258245229
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3574051407588739,
"mc1_stderr": 0.016776599676729398,
"mc2": 0.5241397415740128,
"mc2_stderr": 0.0157002252598079
}
}
```
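The per-task scores above can be summarized by hand. A minimal sketch (using a hypothetical two-task subset of the results dict shown) of averaging the `acc` metric over the harness tasks, skipping the `"all"` summary entry and entries without an `acc` field (such as the TruthfulQA `mc1`/`mc2` block):

```python
def mean_task_acc(results: dict) -> float:
    """Average the 'acc' metric over per-task entries, skipping the 'all' summary."""
    accs = [v["acc"] for k, v in results.items() if k != "all" and "acc" in v]
    return sum(accs) / len(accs)

# Hypothetical two-task subset of the results dict shown above.
sample = {
    "all": {"acc": 0.5},
    "harness|arc:challenge|25": {"acc": 0.5298634812286689},
    "harness|hellaswag|10": {"acc": 0.6100378410675165},
}
print(mean_task_acc(sample))  # mean of the two task-level acc values
```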
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
eara4vu/zsxb | 2023-09-19T05:30:34.000Z | [
"region:us"
] | eara4vu | null | null | null | 0 | 0 | Entry not found |
nostradamus89/1c_code_nano | 2023-09-03T15:52:51.000Z | [
"license:apache-2.0",
"region:us"
] | nostradamus89 | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
042A/hako_content | 2023-09-03T16:05:46.000Z | [
"region:us"
] | 042A | null | null | null | 0 | 0 | Entry not found |
cfvd/dondever | 2023-09-03T19:35:52.000Z | [
"region:us"
] | cfvd | null | null | null | 0 | 0 | # Blog-Duggen
Code for my blog, built with NextJS.
|
LittleNeon/GeckyCode | 2023-09-03T17:09:15.000Z | [
"license:unknown",
"region:us"
] | LittleNeon | null | null | null | 0 | 0 | ---
license: unknown
---
|
tanguyrenaudie/alan | 2023-09-03T17:28:12.000Z | [
"region:us"
] | tanguyrenaudie | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 2440375.0
num_examples: 12
download_size: 2441920
dataset_size: 2440375.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "alan"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
muask/mpokjruk | 2023-09-05T04:45:06.000Z | [
"region:us"
] | muask | null | null | null | 0 | 0 | # EpicGames Freebies Claimer

## ⚠️Status⚠️
EpicGames has made captchas mandatory for claiming free games. Currently, epicgames freebies claimer cannot handle this, so [it is not working](https://github.com/Revadike/epicgames-freebies-claimer/issues/172). I am trying to fix it by implementing anti-captcha solutions. You can track my progress [here](https://github.com/Revadike/epicgames-freebies-claimer/pull/184). Any help would be greatly appreciated!
## Description
Claim [available free game promotions](https://www.epicgames.com/store/free-games) from the Epic Games Store.
## Requirements
* [DeviceAuthGenerator](https://github.com/jackblk/DeviceAuthGenerator/releases)
* [Git](https://git-scm.com/downloads)
* [Node.js](https://nodejs.org/download/) (with build tools checked)
> Node version >= 15
## Instructions - Quick
0. (Optional) ☆ Star this project :)
1. Download/clone this repository
2. Run `npm install`
3. Generate `data/device_auths.json` (using [DeviceAuthGenerator](https://github.com/jackblk/DeviceAuthGenerator))
4. (Optional) Copy `data/config.example.json` to `data/config.json` and edit it
5. Run `npm start`
## Instructions - Detailed
Check out the [wiki](https://github.com/Revadike/epicgames-freebies-claimer/wiki), written by @lucifudge.
## Instructions - Docker
Check out the [wiki](https://github.com/Revadike/epicgames-freebies-claimer/wiki/User-Guide-(Docker)), written by @jackblk.
## FAQ
### Why should I use this?
This is for the truly lazy, you know who you are. ;)
Also, this is a good alternative, in case you don't like using Epic's client or website (and I don't blame you).
### Why should I even bother claiming these free games?
To which I will say, why not? Most of these games are actually outstanding games! Even if you don't like Epic and their shenanigans, you will be pleased to know that Epic actually funds all the free copies that are given away: ["But we actually found it was more economical to pay developers [a lump sum] to distribute their game free for two weeks..."](https://arstechnica.com/gaming/2019/03/epic-ceo-youre-going-to-see-lower-prices-on-epic-games-store/)
## Changelog
[Full changelog in Wiki](https://github.com/Revadike/epicgames-freebies-claimer/releases)
## Happy Freebie Claiming!

|
tea90210/mltest | 2023-09-04T17:38:44.000Z | [
"region:us"
] | tea90210 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 205326
num_examples: 100
download_size: 115128
dataset_size: 205326
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "mltest"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ccaiccie/fortinet-notes | 2023-09-03T17:49:18.000Z | [
"region:us"
] | ccaiccie | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_quantumaikr__quantumairk-llama-2-70B-instruct | 2023-09-03T17:57:50.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of quantumaikr/quantumairk-llama-2-70B-instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [quantumaikr/quantumairk-llama-2-70B-instruct](https://huggingface.co/quantumaikr/quantumairk-llama-2-70B-instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_quantumaikr__quantumairk-llama-2-70B-instruct\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-03T17:56:31.707465](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__quantumairk-llama-2-70B-instruct/blob/main/results_2023-09-03T17%3A56%3A31.707465.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7034684893089378,\n\
\ \"acc_stderr\": 0.03095672218595075,\n \"acc_norm\": 0.7074134061674743,\n\
\ \"acc_norm_stderr\": 0.03092655497572306,\n \"mc1\": 0.3818849449204406,\n\
\ \"mc1_stderr\": 0.017008101939163495,\n \"mc2\": 0.5442099278190564,\n\
\ \"mc2_stderr\": 0.014507128903598229\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6621160409556314,\n \"acc_stderr\": 0.013822047922283504,\n\
\ \"acc_norm\": 0.7030716723549488,\n \"acc_norm_stderr\": 0.013352025976725225\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6787492531368253,\n\
\ \"acc_stderr\": 0.004660025270817022,\n \"acc_norm\": 0.8705437163911571,\n\
\ \"acc_norm_stderr\": 0.003350181812941604\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7960526315789473,\n \"acc_stderr\": 0.0327900040631005,\n\
\ \"acc_norm\": 0.7960526315789473,\n \"acc_norm_stderr\": 0.0327900040631005\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n\
\ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7584905660377359,\n \"acc_stderr\": 0.026341480371118366,\n\
\ \"acc_norm\": 0.7584905660377359,\n \"acc_norm_stderr\": 0.026341480371118366\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8402777777777778,\n\
\ \"acc_stderr\": 0.030635578972093274,\n \"acc_norm\": 0.8402777777777778,\n\
\ \"acc_norm_stderr\": 0.030635578972093274\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n\
\ \"acc_stderr\": 0.03514942551267438,\n \"acc_norm\": 0.6936416184971098,\n\
\ \"acc_norm_stderr\": 0.03514942551267438\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.041633319989322626\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6638297872340425,\n \"acc_stderr\": 0.030881618520676942,\n\
\ \"acc_norm\": 0.6638297872340425,\n \"acc_norm_stderr\": 0.030881618520676942\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.025680564640056882,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.025680564640056882\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5396825396825397,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.5396825396825397,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8258064516129032,\n \"acc_stderr\": 0.02157624818451459,\n \"\
acc_norm\": 0.8258064516129032,\n \"acc_norm_stderr\": 0.02157624818451459\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5665024630541872,\n \"acc_stderr\": 0.034867317274198714,\n \"\
acc_norm\": 0.5665024630541872,\n \"acc_norm_stderr\": 0.034867317274198714\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\"\
: 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8787878787878788,\n \"acc_stderr\": 0.02548549837334323,\n\
\ \"acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.02548549837334323\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9481865284974094,\n \"acc_stderr\": 0.01599622932024412,\n\
\ \"acc_norm\": 0.9481865284974094,\n \"acc_norm_stderr\": 0.01599622932024412\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7205128205128205,\n \"acc_stderr\": 0.022752388839776823,\n\
\ \"acc_norm\": 0.7205128205128205,\n \"acc_norm_stderr\": 0.022752388839776823\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114986,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114986\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7773109243697479,\n \"acc_stderr\": 0.02702543349888238,\n \
\ \"acc_norm\": 0.7773109243697479,\n \"acc_norm_stderr\": 0.02702543349888238\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5099337748344371,\n \"acc_stderr\": 0.04081677107248437,\n \"\
acc_norm\": 0.5099337748344371,\n \"acc_norm_stderr\": 0.04081677107248437\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9009174311926605,\n \"acc_stderr\": 0.01280978008187893,\n \"\
acc_norm\": 0.9009174311926605,\n \"acc_norm_stderr\": 0.01280978008187893\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.625,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.625,\n\
\ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n\
\ \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640255,\n \
\ \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640255\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n\
\ \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n\
\ \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752596,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752596\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622804,\n \"\
acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622804\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n\
\ \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n\
\ \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n\
\ \"acc_stderr\": 0.017893784904018533,\n \"acc_norm\": 0.9188034188034188,\n\
\ \"acc_norm_stderr\": 0.017893784904018533\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8710089399744572,\n\
\ \"acc_stderr\": 0.01198637154808687,\n \"acc_norm\": 0.8710089399744572,\n\
\ \"acc_norm_stderr\": 0.01198637154808687\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7687861271676301,\n \"acc_stderr\": 0.022698657167855713,\n\
\ \"acc_norm\": 0.7687861271676301,\n \"acc_norm_stderr\": 0.022698657167855713\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5988826815642458,\n\
\ \"acc_stderr\": 0.016392221899407075,\n \"acc_norm\": 0.5988826815642458,\n\
\ \"acc_norm_stderr\": 0.016392221899407075\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.024288619466046112,\n\
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.024288619466046112\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7684887459807074,\n\
\ \"acc_stderr\": 0.023956532766639133,\n \"acc_norm\": 0.7684887459807074,\n\
\ \"acc_norm_stderr\": 0.023956532766639133\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8302469135802469,\n \"acc_stderr\": 0.020888690414093868,\n\
\ \"acc_norm\": 0.8302469135802469,\n \"acc_norm_stderr\": 0.020888690414093868\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5886524822695035,\n \"acc_stderr\": 0.02935491115994097,\n \
\ \"acc_norm\": 0.5886524822695035,\n \"acc_norm_stderr\": 0.02935491115994097\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5840938722294654,\n\
\ \"acc_stderr\": 0.0125883238503136,\n \"acc_norm\": 0.5840938722294654,\n\
\ \"acc_norm_stderr\": 0.0125883238503136\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7132352941176471,\n \"acc_stderr\": 0.027472274473233818,\n\
\ \"acc_norm\": 0.7132352941176471,\n \"acc_norm_stderr\": 0.027472274473233818\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7532679738562091,\n \"acc_stderr\": 0.017440820367402507,\n \
\ \"acc_norm\": 0.7532679738562091,\n \"acc_norm_stderr\": 0.017440820367402507\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\
\ \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n\
\ \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7673469387755102,\n \"acc_stderr\": 0.027049257915896175,\n\
\ \"acc_norm\": 0.7673469387755102,\n \"acc_norm_stderr\": 0.027049257915896175\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n\
\ \"acc_stderr\": 0.023729830881018515,\n \"acc_norm\": 0.8706467661691543,\n\
\ \"acc_norm_stderr\": 0.023729830881018515\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160886,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160886\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3818849449204406,\n\
\ \"mc1_stderr\": 0.017008101939163495,\n \"mc2\": 0.5442099278190564,\n\
\ \"mc2_stderr\": 0.014507128903598229\n }\n}\n```"
repo_url: https://huggingface.co/quantumaikr/quantumairk-llama-2-70B-instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|arc:challenge|25_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hellaswag|10_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T17:56:31.707465.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T17:56:31.707465.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T17:56:31.707465.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T17:56:31.707465.parquet'
- config_name: results
data_files:
- split: 2023_09_03T17_56_31.707465
path:
- results_2023-09-03T17:56:31.707465.parquet
- split: latest
path:
- results_2023-09-03T17:56:31.707465.parquet
---
# Dataset Card for Evaluation run of quantumaikr/quantumairk-llama-2-70B-instruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/quantumaikr/quantumairk-llama-2-70B-instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [quantumaikr/quantumairk-llama-2-70B-instruct](https://huggingface.co/quantumaikr/quantumairk-llama-2-70B-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_quantumaikr__quantumairk-llama-2-70B-instruct",
"harness_truthfulqa_mc_0",
split="train")
```
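Each configuration also exposes the timestamped splits described above. As a minimal sketch (assuming split names follow the `2023_09_03T17_56_31.707465` pattern used in this dataset, with `latest` as an alias), the most recent run can be picked programmatically like this; `most_recent_split` is a hypothetical helper, not part of the `datasets` API:

```python
from datetime import datetime

def most_recent_split(split_names):
    """Return the newest timestamped split name, ignoring the 'latest' alias."""
    stamped = [s for s in split_names if s != "latest"]
    # Split names look like "2023_09_03T17_56_31.707465"
    return max(stamped, key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"))

print(most_recent_split(["2023_09_03T17_56_31.707465", "latest"]))
```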
## Latest results
These are the [latest results from run 2023-09-03T17:56:31.707465](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__quantumairk-llama-2-70B-instruct/blob/main/results_2023-09-03T17%3A56%3A31.707465.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its timestamped splits and in the "latest" split for that eval):
```python
{
"all": {
"acc": 0.7034684893089378,
"acc_stderr": 0.03095672218595075,
"acc_norm": 0.7074134061674743,
"acc_norm_stderr": 0.03092655497572306,
"mc1": 0.3818849449204406,
"mc1_stderr": 0.017008101939163495,
"mc2": 0.5442099278190564,
"mc2_stderr": 0.014507128903598229
},
"harness|arc:challenge|25": {
"acc": 0.6621160409556314,
"acc_stderr": 0.013822047922283504,
"acc_norm": 0.7030716723549488,
"acc_norm_stderr": 0.013352025976725225
},
"harness|hellaswag|10": {
"acc": 0.6787492531368253,
"acc_stderr": 0.004660025270817022,
"acc_norm": 0.8705437163911571,
"acc_norm_stderr": 0.003350181812941604
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7960526315789473,
"acc_stderr": 0.0327900040631005,
"acc_norm": 0.7960526315789473,
"acc_norm_stderr": 0.0327900040631005
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7584905660377359,
"acc_stderr": 0.026341480371118366,
"acc_norm": 0.7584905660377359,
"acc_norm_stderr": 0.026341480371118366
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8402777777777778,
"acc_stderr": 0.030635578972093274,
"acc_norm": 0.8402777777777778,
"acc_norm_stderr": 0.030635578972093274
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.03514942551267438,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.03514942551267438
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6638297872340425,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.6638297872340425,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.025680564640056882,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.025680564640056882
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5396825396825397,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.5396825396825397,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8258064516129032,
"acc_stderr": 0.02157624818451459,
"acc_norm": 0.8258064516129032,
"acc_norm_stderr": 0.02157624818451459
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5665024630541872,
"acc_stderr": 0.034867317274198714,
"acc_norm": 0.5665024630541872,
"acc_norm_stderr": 0.034867317274198714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.02548549837334323,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.02548549837334323
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9481865284974094,
"acc_stderr": 0.01599622932024412,
"acc_norm": 0.9481865284974094,
"acc_norm_stderr": 0.01599622932024412
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7205128205128205,
"acc_stderr": 0.022752388839776823,
"acc_norm": 0.7205128205128205,
"acc_norm_stderr": 0.022752388839776823
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114986,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114986
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7773109243697479,
"acc_stderr": 0.02702543349888238,
"acc_norm": 0.7773109243697479,
"acc_norm_stderr": 0.02702543349888238
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5099337748344371,
"acc_stderr": 0.04081677107248437,
"acc_norm": 0.5099337748344371,
"acc_norm_stderr": 0.04081677107248437
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9009174311926605,
"acc_stderr": 0.01280978008187893,
"acc_norm": 0.9009174311926605,
"acc_norm_stderr": 0.01280978008187893
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.625,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.625,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658928,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658928
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.019269323025640255,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.019269323025640255
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752596,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752596
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.029199802455622804,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.029199802455622804
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.036028141763926456,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.036028141763926456
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971726,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971726
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9188034188034188,
"acc_stderr": 0.017893784904018533,
"acc_norm": 0.9188034188034188,
"acc_norm_stderr": 0.017893784904018533
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8710089399744572,
"acc_stderr": 0.01198637154808687,
"acc_norm": 0.8710089399744572,
"acc_norm_stderr": 0.01198637154808687
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7687861271676301,
"acc_stderr": 0.022698657167855713,
"acc_norm": 0.7687861271676301,
"acc_norm_stderr": 0.022698657167855713
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5988826815642458,
"acc_stderr": 0.016392221899407075,
"acc_norm": 0.5988826815642458,
"acc_norm_stderr": 0.016392221899407075
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.024288619466046112,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.024288619466046112
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7684887459807074,
"acc_stderr": 0.023956532766639133,
"acc_norm": 0.7684887459807074,
"acc_norm_stderr": 0.023956532766639133
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8302469135802469,
"acc_stderr": 0.020888690414093868,
"acc_norm": 0.8302469135802469,
"acc_norm_stderr": 0.020888690414093868
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5886524822695035,
"acc_stderr": 0.02935491115994097,
"acc_norm": 0.5886524822695035,
"acc_norm_stderr": 0.02935491115994097
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5840938722294654,
"acc_stderr": 0.0125883238503136,
"acc_norm": 0.5840938722294654,
"acc_norm_stderr": 0.0125883238503136
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7132352941176471,
"acc_stderr": 0.027472274473233818,
"acc_norm": 0.7132352941176471,
"acc_norm_stderr": 0.027472274473233818
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7532679738562091,
"acc_stderr": 0.017440820367402507,
"acc_norm": 0.7532679738562091,
"acc_norm_stderr": 0.017440820367402507
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7673469387755102,
"acc_stderr": 0.027049257915896175,
"acc_norm": 0.7673469387755102,
"acc_norm_stderr": 0.027049257915896175
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.023729830881018515,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.023729830881018515
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160886,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160886
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3818849449204406,
"mc1_stderr": 0.017008101939163495,
"mc2": 0.5442099278190564,
"mc2_stderr": 0.014507128903598229
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
stefan-it/autotrain-flair-hipe2022-de-hmbert | 2023-09-04T00:08:49.000Z | [
"region:us"
] | stefan-it | null | null | null | 0 | 0 | # NER Fine-Tuning
We use Flair for fine-tuning NER models on
[HIPE-2022](https://github.com/hipe-eval/HIPE-2022-data) datasets from
[HIPE-2022 Shared Task](https://hipe-eval.github.io/HIPE-2022/).
All models are fine-tuned on A10 (24GB) and A100 (40GB) instances from
[Lambda Cloud](https://lambdalabs.com/service/gpu-cloud) using Flair:
```bash
$ git clone https://github.com/flairNLP/flair.git
$ cd flair && git checkout 419f13a05d6b36b2a42dd73a551dc3ba679f820c
$ pip3 install -e .
$ cd ..
```
Clone this repo for fine-tuning NER models:
```bash
$ git clone https://github.com/stefan-it/hmTEAMS.git
$ cd hmTEAMS/bench
```
Authorize via Hugging Face CLI (needed because hmTEAMS is currently only available after approval):
```bash
# Use access token from https://huggingface.co/settings/tokens
$ huggingface-cli login
```
We use a config-driven hyper-parameter search. The script [`flair-fine-tuner.py`](flair-fine-tuner.py) can be used to
fine-tune NER models from our Model Zoo.
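
Each experiment is driven by a small JSON config consumed by the fine-tuning script. The fragment below is purely illustrative — the field names and values are assumptions, not the script's actual schema:

```json
{
  "model": "<hf_model_id>",
  "dataset": "<hipe2022_dataset>/<language>",
  "batch_sizes": [8, 4],
  "learning_rates": [3e-05, 5e-05],
  "epochs": 10,
  "seeds": [1, 2, 3, 4, 5]
}
```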
# Benchmark
We test our pretrained language models on various datasets from HIPE-2020, HIPE-2022 and Europeana. The following table
shows an overview of the datasets used.
| Language | Datasets
|----------|----------------------------------------------------|
| English | [AjMC] - [TopRes19th] |
| German | [AjMC] - [NewsEye] |
| French | [AjMC] - [ICDAR-Europeana] - [LeTemps] - [NewsEye] |
| Finnish | [NewsEye] |
| Swedish | [NewsEye] |
| Dutch | [ICDAR-Europeana] |
[AjMC]: https://github.com/hipe-eval/HIPE-2022-data/blob/main/documentation/README-ajmc.md
[NewsEye]: https://github.com/hipe-eval/HIPE-2022-data/blob/main/documentation/README-newseye.md
[TopRes19th]: https://github.com/hipe-eval/HIPE-2022-data/blob/main/documentation/README-topres19th.md
[ICDAR-Europeana]: https://github.com/stefan-it/historic-domain-adaptation-icdar
[LeTemps]: https://github.com/hipe-eval/HIPE-2022-data/blob/main/documentation/README-letemps.md
# Results
We report the averaged F1-score over 5 runs with different seeds on the development set:
| Model | English AjMC | German AjMC | French AjMC | German NewsEye | French NewsEye | Finnish NewsEye | Swedish NewsEye | Dutch ICDAR | French ICDAR | French LeTemps | English TopRes19th | Avg. |
|---------------------------------------------------------------------------|--------------|--------------|--------------|----------------|----------------|-----------------|-----------------|--------------|--------------|----------------|--------------------|-----------|
| hmBERT (32k) [Schweter et al.](https://ceur-ws.org/Vol-3180/paper-87.pdf) | 85.36 ± 0.94 | 89.08 ± 0.09 | 85.10 ± 0.60 | 39.65 ± 1.01 | 81.47 ± 0.36 | 77.28 ± 0.37 | 82.85 ± 0.83 | 82.11 ± 0.61 | 77.21 ± 0.16 | 65.73 ± 0.56 | 80.94 ± 0.86 | 76.98 |
| hmTEAMS (Ours) | 86.41 ± 0.36 | 88.64 ± 0.42 | 85.41 ± 0.67 | 41.51 ± 2.82 | 83.20 ± 0.79 | 79.27 ± 1.88 | 82.78 ± 0.60 | 88.21 ± 0.39 | 78.03 ± 0.39 | 66.71 ± 0.46 | 81.36 ± 0.59 | **78.32** |
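
The Avg. column is the unweighted mean of the eleven per-benchmark development F1 scores; a quick check reproduces both row averages:

```python
# Scores copied from the table above (dev F1, averaged over 5 seeds).
hmbert = [85.36, 89.08, 85.10, 39.65, 81.47, 77.28, 82.85, 82.11, 77.21, 65.73, 80.94]
hmteams = [86.41, 88.64, 85.41, 41.51, 83.20, 79.27, 82.78, 88.21, 78.03, 66.71, 81.36]

def avg(scores):
    """Unweighted mean, rounded to two decimals as in the Avg. column."""
    return round(sum(scores) / len(scores), 2)

print(avg(hmbert))   # 76.98
print(avg(hmteams))  # 78.32
```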
|
minfeng-ai/leetcode_preference | 2023-09-06T01:08:25.000Z | [
"arxiv:2305.18290",
"region:us"
] | minfeng-ai | null | null | null | 1 | 0 | ---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/datasetcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/datasets-cards
{}
---
# Dataset Card for LeetCode Preference
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset facilitates experiments using Direct Preference Optimization (DPO), as outlined in the paper [Direct Preference Optimization: Your Language Model is Secretly a Reward Model](https://arxiv.org/abs/2305.18290). The repository provides pairs of code solutions generated by CodeLLaMA-7b: for every LeetCode question posed, CodeLLaMA-7b produces two distinct solutions, which are subsequently evaluated and ranked by human experts based on their accuracy, efficiency, and readability.
### Usage
```python
from datasets import load_dataset
dataset = load_dataset("minfeng-ai/leetcode_preference")
```
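
Downstream, each row can be turned into the `(prompt, chosen, rejected)` triple that DPO trainers expect. The sketch below is illustrative only: it assumes the `preference` field stores the string `"version1"` or `"version2"` (the actual label encoding may differ) and uses a hypothetical row rather than a real dataset record:

```python
def to_dpo_triple(row):
    """Map one dataset row to a DPO (prompt, chosen, rejected) triple."""
    chosen_key = row["preference"]  # assumed to be "version1" or "version2"
    rejected_key = "version2" if chosen_key == "version1" else "version1"
    return {
        "prompt": row["description"],
        "chosen": row[chosen_key],
        "rejected": row[rejected_key],
    }

# Hypothetical example row following the schema described in this card.
row = {
    "id": 1,
    "title": "Two Sum",
    "description": "Given an array of integers, return indices of two numbers that add up to a target.",
    "difficulty": "Easy",
    "version1": "def two_sum(nums, target): ...",
    "version2": "def two_sum(nums, target):\n    seen = {}\n    ...",
    "preference": "version2",
}
triple = to_dpo_triple(row)
```

With `datasets.Dataset.map(to_dpo_triple)`, a whole split can be converted in one pass.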
### Data Fields
Each row of the dataset contains the following fields:
* id: A distinct identifier assigned to each LeetCode question.
* title: The official title of the LeetCode question.
* description: An in-depth prompt offering detailed insights into the respective question.
* difficulty: Categorized into three tiers, indicating the complexity of the question - Easy, Medium, and Hard.
* version1: The initial AI-generated code snippet pertaining to the question.
* version2: A secondary AI-generated code snippet related to the same question.
* preference: A human-assessed label indicating the preferred code snippet between version1 and version2. |
open-llm-leaderboard/details_xxyyy123__mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2 | 2023-09-17T16:41:36.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2](https://huggingface.co/xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xxyyy123__mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T16:41:24.154084](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2/blob/main/results_2023-09-17T16-41-24.154084.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.272126677852349,\n\
\ \"em_stderr\": 0.004557777416899833,\n \"f1\": 0.34851929530201453,\n\
\ \"f1_stderr\": 0.004500795514577557,\n \"acc\": 0.36653625926220684,\n\
\ \"acc_stderr\": 0.007090302750388251\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.272126677852349,\n \"em_stderr\": 0.004557777416899833,\n\
\ \"f1\": 0.34851929530201453,\n \"f1_stderr\": 0.004500795514577557\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0037907505686125853,\n \
\ \"acc_stderr\": 0.0016927007401502038\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7292817679558011,\n \"acc_stderr\": 0.012487904760626299\n\
\ }\n}\n```"
repo_url: https://huggingface.co/xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|arc:challenge|25_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T16_41_24.154084
path:
- '**/details_harness|drop|3_2023-09-17T16-41-24.154084.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T16-41-24.154084.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T16_41_24.154084
path:
- '**/details_harness|gsm8k|5_2023-09-17T16-41-24.154084.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T16-41-24.154084.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hellaswag|10_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T18:33:19.019825.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T18:33:19.019825.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T18:33:19.019825.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T16_41_24.154084
path:
- '**/details_harness|winogrande|5_2023-09-17T16-41-24.154084.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T16-41-24.154084.parquet'
- config_name: results
data_files:
- split: 2023_09_03T18_33_19.019825
path:
- results_2023-09-03T18:33:19.019825.parquet
- split: 2023_09_17T16_41_24.154084
path:
- results_2023-09-17T16-41-24.154084.parquet
- split: latest
path:
- results_2023-09-17T16-41-24.154084.parquet
---
# Dataset Card for Evaluation run of xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2](https://huggingface.co/xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xxyyy123__mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2",
"harness_winogrande_5",
split="train")
```
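The configuration names above follow a regular naming scheme. As a minimal illustrative sketch (the helper names below are assumptions for this card, not part of any official API), the mapping from a harness task name and few-shot count to its configuration name and parquet glob can be written as:

```python
# Illustrative helpers: derive the config name and parquet glob used by this
# dataset's naming scheme, as seen in the YAML listing above.

def harness_config_name(task: str, num_fewshot: int) -> str:
    # "hendrycksTest-anatomy", 5 -> "harness_hendrycksTest_anatomy_5"
    # "-" and ":" in task names become "_" in config names.
    return "harness_" + task.replace("-", "_").replace(":", "_") + f"_{num_fewshot}"

def harness_parquet_glob(task: str, num_fewshot: int, timestamp: str) -> str:
    # Matches patterns like:
    # "**/details_harness|hendrycksTest-anatomy|5_<timestamp>.parquet"
    return f"**/details_harness|{task}|{num_fewshot}_{timestamp}.parquet"

print(harness_config_name("hendrycksTest-anatomy", 5))
print(harness_parquet_glob("winogrande", 5, "2023-09-17T16-41-24.154084"))
```

Any config name built this way can be passed as the second argument of `load_dataset` above.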
## Latest results
These are the [latest results from run 2023-09-17T16:41:24.154084](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2/blob/main/results_2023-09-17T16-41-24.154084.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.272126677852349,
"em_stderr": 0.004557777416899833,
"f1": 0.34851929530201453,
"f1_stderr": 0.004500795514577557,
"acc": 0.36653625926220684,
"acc_stderr": 0.007090302750388251
},
"harness|drop|3": {
"em": 0.272126677852349,
"em_stderr": 0.004557777416899833,
"f1": 0.34851929530201453,
"f1_stderr": 0.004500795514577557
},
"harness|gsm8k|5": {
"acc": 0.0037907505686125853,
"acc_stderr": 0.0016927007401502038
},
"harness|winogrande|5": {
"acc": 0.7292817679558011,
"acc_stderr": 0.012487904760626299
}
}
```
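The results payload is a plain nested dict keyed by task name. As a small sketch (values copied verbatim from the block above), the per-task accuracies can be pulled out like this:

```python
# Values copied verbatim from the "Latest results" block above.
results = {
    "all": {"em": 0.272126677852349, "f1": 0.34851929530201453,
            "acc": 0.36653625926220684},
    "harness|drop|3": {"em": 0.272126677852349, "f1": 0.34851929530201453},
    "harness|gsm8k|5": {"acc": 0.0037907505686125853},
    "harness|winogrande|5": {"acc": 0.7292817679558011},
}

# Per-task accuracy, skipping the aggregated "all" entry and tasks
# (like drop) that report em/f1 instead of acc.
task_accs = {task: scores["acc"]
             for task, scores in results.items()
             if task != "all" and "acc" in scores}
print(task_accs)
```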
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_xxyyy123__mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2 | 2023-09-03T18:42:23.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2](https://huggingface.co/xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xxyyy123__mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-03T18:41:04.280567](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2/blob/main/results_2023-09-03T18%3A41%3A04.280567.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5159772470651705,\n\
\ \"acc_stderr\": 0.03490050368845693,\n \"acc_norm\": 0.5196198874675843,\n\
\ \"acc_norm_stderr\": 0.03488383911166199,\n \"mc1\": 0.3574051407588739,\n\
\ \"mc1_stderr\": 0.0167765996767294,\n \"mc2\": 0.5084843623108531,\n\
\ \"mc2_stderr\": 0.015788699144390992\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5537542662116041,\n \"acc_stderr\": 0.014526705548539982,\n\
\ \"acc_norm\": 0.5810580204778157,\n \"acc_norm_stderr\": 0.014418106953639013\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6132244572794264,\n\
\ \"acc_stderr\": 0.004860162076330978,\n \"acc_norm\": 0.8008364867556264,\n\
\ \"acc_norm_stderr\": 0.0039855506403304606\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.030151134457776285,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.030151134457776285\n \
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n\
\ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4797687861271676,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.4797687861271676,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929775,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929775\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4851063829787234,\n \"acc_stderr\": 0.032671518489247764,\n\
\ \"acc_norm\": 0.4851063829787234,\n \"acc_norm_stderr\": 0.032671518489247764\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.044045561573747664,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.044045561573747664\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523867,\n \"\
acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523867\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.040061680838488774,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.040061680838488774\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5612903225806452,\n\
\ \"acc_stderr\": 0.028229497320317216,\n \"acc_norm\": 0.5612903225806452,\n\
\ \"acc_norm_stderr\": 0.028229497320317216\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3842364532019704,\n \"acc_stderr\": 0.0342239856565755,\n\
\ \"acc_norm\": 0.3842364532019704,\n \"acc_norm_stderr\": 0.0342239856565755\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n\
\ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6818181818181818,\n \"acc_stderr\": 0.0331847733384533,\n \"acc_norm\"\
: 0.6818181818181818,\n \"acc_norm_stderr\": 0.0331847733384533\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7512953367875648,\n \"acc_stderr\": 0.031195840877700286,\n\
\ \"acc_norm\": 0.7512953367875648,\n \"acc_norm_stderr\": 0.031195840877700286\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.48205128205128206,\n \"acc_stderr\": 0.02533466708095495,\n\
\ \"acc_norm\": 0.48205128205128206,\n \"acc_norm_stderr\": 0.02533466708095495\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5126050420168067,\n \"acc_stderr\": 0.03246816765752174,\n \
\ \"acc_norm\": 0.5126050420168067,\n \"acc_norm_stderr\": 0.03246816765752174\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7192660550458716,\n \"acc_stderr\": 0.019266055045871623,\n \"\
acc_norm\": 0.7192660550458716,\n \"acc_norm_stderr\": 0.019266055045871623\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n\
\ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.03228210387037892,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.03228210387037892\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842548,\n \
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842548\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n\
\ \"acc_stderr\": 0.03292802819330314,\n \"acc_norm\": 0.5964125560538116,\n\
\ \"acc_norm_stderr\": 0.03292802819330314\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\"\
: 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n\
\ \"acc_stderr\": 0.046166311118017125,\n \"acc_norm\": 0.6481481481481481,\n\
\ \"acc_norm_stderr\": 0.046166311118017125\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5705521472392638,\n \"acc_stderr\": 0.03889066619112723,\n\
\ \"acc_norm\": 0.5705521472392638,\n \"acc_norm_stderr\": 0.03889066619112723\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572924,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572924\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.027236013946196704,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.027236013946196704\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-miscellaneous|5\"\
: {\n \"acc\": 0.7100893997445722,\n \"acc_stderr\": 0.01622501794477098,\n\
\ \"acc_norm\": 0.7100893997445722,\n \"acc_norm_stderr\": 0.01622501794477098\n\
\ },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5809248554913294,\n\
\ \"acc_stderr\": 0.02656417811142262,\n \"acc_norm\": 0.5809248554913294,\n\
\ \"acc_norm_stderr\": 0.02656417811142262\n },\n \"harness|hendrycksTest-moral_scenarios|5\"\
: {\n \"acc\": 0.264804469273743,\n \"acc_stderr\": 0.014756906483260664,\n\
\ \"acc_norm\": 0.264804469273743,\n \"acc_norm_stderr\": 0.014756906483260664\n\
\ },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5261437908496732,\n\
\ \"acc_stderr\": 0.028590752958852394,\n \"acc_norm\": 0.5261437908496732,\n\
\ \"acc_norm_stderr\": 0.028590752958852394\n },\n \"harness|hendrycksTest-philosophy|5\"\
: {\n \"acc\": 0.5884244372990354,\n \"acc_stderr\": 0.027950481494401266,\n\
\ \"acc_norm\": 0.5884244372990354,\n \"acc_norm_stderr\": 0.027950481494401266\n\
\ },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5648148148148148,\n\
\ \"acc_stderr\": 0.027586006221607708,\n \"acc_norm\": 0.5648148148148148,\n\
\ \"acc_norm_stderr\": 0.027586006221607708\n },\n \"harness|hendrycksTest-professional_accounting|5\"\
: {\n \"acc\": 0.375886524822695,\n \"acc_stderr\": 0.028893955412115882,\n\
\ \"acc_norm\": 0.375886524822695,\n \"acc_norm_stderr\": 0.028893955412115882\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38396349413298564,\n\
\ \"acc_stderr\": 0.01242158783313423,\n \"acc_norm\": 0.38396349413298564,\n\
\ \"acc_norm_stderr\": 0.01242158783313423\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.03036544647727568,\n\
\ \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.03036544647727568\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4869281045751634,\n \"acc_stderr\": 0.020220920829626912,\n \
\ \"acc_norm\": 0.4869281045751634,\n \"acc_norm_stderr\": 0.020220920829626912\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6122448979591837,\n \"acc_stderr\": 0.031192230726795656,\n\
\ \"acc_norm\": 0.6122448979591837,\n \"acc_norm_stderr\": 0.031192230726795656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6119402985074627,\n\
\ \"acc_stderr\": 0.034457899643627506,\n \"acc_norm\": 0.6119402985074627,\n\
\ \"acc_norm_stderr\": 0.034457899643627506\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079022,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079022\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.03528211258245229,\n\
\ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.03528211258245229\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3574051407588739,\n\
\ \"mc1_stderr\": 0.0167765996767294,\n \"mc2\": 0.5084843623108531,\n\
\ \"mc2_stderr\": 0.015788699144390992\n }\n}\n```"
repo_url: https://huggingface.co/xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|arc:challenge|25_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hellaswag|10_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T18:41:04.280567.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T18:41:04.280567.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T18:41:04.280567.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T18:41:04.280567.parquet'
- config_name: results
data_files:
- split: 2023_09_03T18_41_04.280567
path:
- results_2023-09-03T18:41:04.280567.parquet
- split: latest
path:
- results_2023-09-03T18:41:04.280567.parquet
---
# Dataset Card for Evaluation run of xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2](https://huggingface.co/xxyyy123/mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xxyyy123__mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-03T18:41:04.280567](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qk_rank14_v2/blob/main/results_2023-09-03T18%3A41%3A04.280567.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5159772470651705,
"acc_stderr": 0.03490050368845693,
"acc_norm": 0.5196198874675843,
"acc_norm_stderr": 0.03488383911166199,
"mc1": 0.3574051407588739,
"mc1_stderr": 0.0167765996767294,
"mc2": 0.5084843623108531,
"mc2_stderr": 0.015788699144390992
},
"harness|arc:challenge|25": {
"acc": 0.5537542662116041,
"acc_stderr": 0.014526705548539982,
"acc_norm": 0.5810580204778157,
"acc_norm_stderr": 0.014418106953639013
},
"harness|hellaswag|10": {
"acc": 0.6132244572794264,
"acc_stderr": 0.004860162076330978,
"acc_norm": 0.8008364867556264,
"acc_norm_stderr": 0.0039855506403304606
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6,
"acc_stderr": 0.030151134457776285,
"acc_norm": 0.6,
"acc_norm_stderr": 0.030151134457776285
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5625,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4797687861271676,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.4797687861271676,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929775,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929775
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4851063829787234,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.4851063829787234,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.044045561573747664,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.044045561573747664
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.023809523809523867,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.023809523809523867
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.040061680838488774,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.040061680838488774
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5612903225806452,
"acc_stderr": 0.028229497320317216,
"acc_norm": 0.5612903225806452,
"acc_norm_stderr": 0.028229497320317216
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3842364532019704,
"acc_stderr": 0.0342239856565755,
"acc_norm": 0.3842364532019704,
"acc_norm_stderr": 0.0342239856565755
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.0331847733384533,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.0331847733384533
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7512953367875648,
"acc_stderr": 0.031195840877700286,
"acc_norm": 0.7512953367875648,
"acc_norm_stderr": 0.031195840877700286
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.48205128205128206,
"acc_stderr": 0.02533466708095495,
"acc_norm": 0.48205128205128206,
"acc_norm_stderr": 0.02533466708095495
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895991,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895991
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5126050420168067,
"acc_stderr": 0.03246816765752174,
"acc_norm": 0.5126050420168067,
"acc_norm_stderr": 0.03246816765752174
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7192660550458716,
"acc_stderr": 0.019266055045871623,
"acc_norm": 0.7192660550458716,
"acc_norm_stderr": 0.019266055045871623
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.375,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.375,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.03228210387037892,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.03228210387037892
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.029178682304842548,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.029178682304842548
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5964125560538116,
"acc_stderr": 0.03292802819330314,
"acc_norm": 0.5964125560538116,
"acc_norm_stderr": 0.03292802819330314
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.046166311118017125,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.046166311118017125
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5705521472392638,
"acc_stderr": 0.03889066619112723,
"acc_norm": 0.5705521472392638,
"acc_norm_stderr": 0.03889066619112723
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.04498676320572924,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.04498676320572924
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.027236013946196704,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.027236013946196704
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7100893997445722,
"acc_stderr": 0.01622501794477098,
"acc_norm": 0.7100893997445722,
"acc_norm_stderr": 0.01622501794477098
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5809248554913294,
"acc_stderr": 0.02656417811142262,
"acc_norm": 0.5809248554913294,
"acc_norm_stderr": 0.02656417811142262
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.264804469273743,
"acc_stderr": 0.014756906483260664,
"acc_norm": 0.264804469273743,
"acc_norm_stderr": 0.014756906483260664
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5261437908496732,
"acc_stderr": 0.028590752958852394,
"acc_norm": 0.5261437908496732,
"acc_norm_stderr": 0.028590752958852394
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5884244372990354,
"acc_stderr": 0.027950481494401266,
"acc_norm": 0.5884244372990354,
"acc_norm_stderr": 0.027950481494401266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.027586006221607708,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.027586006221607708
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.375886524822695,
"acc_stderr": 0.028893955412115882,
"acc_norm": 0.375886524822695,
"acc_norm_stderr": 0.028893955412115882
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38396349413298564,
"acc_stderr": 0.01242158783313423,
"acc_norm": 0.38396349413298564,
"acc_norm_stderr": 0.01242158783313423
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4889705882352941,
"acc_stderr": 0.03036544647727568,
"acc_norm": 0.4889705882352941,
"acc_norm_stderr": 0.03036544647727568
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4869281045751634,
"acc_stderr": 0.020220920829626912,
"acc_norm": 0.4869281045751634,
"acc_norm_stderr": 0.020220920829626912
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6122448979591837,
"acc_stderr": 0.031192230726795656,
"acc_norm": 0.6122448979591837,
"acc_norm_stderr": 0.031192230726795656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6119402985074627,
"acc_stderr": 0.034457899643627506,
"acc_norm": 0.6119402985074627,
"acc_norm_stderr": 0.034457899643627506
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079022,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079022
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.03528211258245229,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.03528211258245229
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3574051407588739,
"mc1_stderr": 0.0167765996767294,
"mc2": 0.5084843623108531,
"mc2_stderr": 0.015788699144390992
}
}
```
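As an illustration, aggregate scores can be recomputed from the per-task entries in this dictionary; a minimal sketch using three of the hendrycksTest values copied from the results above (the dictionary is abridged here for brevity):

```python
# Recompute a macro-average over hendrycksTest subjects from the results dict.
# Abridged to three subjects; the full results contain one entry per task.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.28},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.48148148148148145},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.47368421052631576},
}

# Keep only the hendrycksTest tasks and average their normalized accuracy.
mmlu_scores = [
    v["acc_norm"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
macro_avg = sum(mmlu_scores) / len(mmlu_scores)
print(round(macro_avg, 4))  # → 0.4117
```

The same pattern extends to the full set of subjects in the results above.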
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Slichi/croera | 2023-09-03T19:04:01.000Z | [
"license:openrail",
"region:us"
] | Slichi | null | null | null | 0 | 0 | ---
license: openrail
---
|
solomars/solo123 | 2023-09-03T19:08:41.000Z | [
"region:us"
] | solomars | null | null | null | 0 | 0 | git lfs install
git clone https://huggingface.co/redstonehero/epicphotogasm_v1 |
jfloquet/dataset | 2023-09-03T19:11:10.000Z | [
"region:us"
] | jfloquet | null | null | null | 0 | 0 | Entry not found |
Saksham10025/Plank | 2023-09-03T19:15:57.000Z | [
"region:us"
] | Saksham10025 | null | null | null | 0 | 0 | Entry not found |
gaodrew/thera-1250 | 2023-09-03T19:59:14.000Z | [
"region:us"
] | gaodrew | null | null | null | 0 | 0 | Entry not found |
AkikJana/parsed_data | 2023-09-04T09:31:12.000Z | [
"region:us"
] | AkikJana | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_KnutJaegersberg__LLongMA-3b-LIMA | 2023-09-03T20:11:12.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of KnutJaegersberg/LLongMA-3b-LIMA
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KnutJaegersberg/LLongMA-3b-LIMA](https://huggingface.co/KnutJaegersberg/LLongMA-3b-LIMA)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__LLongMA-3b-LIMA\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-03T20:09:53.352642](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__LLongMA-3b-LIMA/blob/main/results_2023-09-03T20%3A09%3A53.352642.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2698380429406577,\n\
\ \"acc_stderr\": 0.03212683690771179,\n \"acc_norm\": 0.2733784546368821,\n\
\ \"acc_norm_stderr\": 0.03212503812423912,\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.01481619599193158,\n \"mc2\": 0.34711273270028703,\n\
\ \"mc2_stderr\": 0.013476356012838524\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3643344709897611,\n \"acc_stderr\": 0.014063260279882417,\n\
\ \"acc_norm\": 0.39078498293515357,\n \"acc_norm_stderr\": 0.014258563880513778\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.48904600677155946,\n\
\ \"acc_stderr\": 0.0049885838203099185,\n \"acc_norm\": 0.6714797849034057,\n\
\ \"acc_norm_stderr\": 0.004687151994791093\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411022,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411022\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34074074074074073,\n\
\ \"acc_stderr\": 0.04094376269996794,\n \"acc_norm\": 0.34074074074074073,\n\
\ \"acc_norm_stderr\": 0.04094376269996794\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03523807393012047,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03523807393012047\n \
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2792452830188679,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.2792452830188679,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.28901734104046245,\n\
\ \"acc_stderr\": 0.03456425745087001,\n \"acc_norm\": 0.28901734104046245,\n\
\ \"acc_norm_stderr\": 0.03456425745087001\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102967,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102967\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.31724137931034485,\n \"acc_stderr\": 0.038783523721386215,\n\
\ \"acc_norm\": 0.31724137931034485,\n \"acc_norm_stderr\": 0.038783523721386215\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\
\ \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n\
\ \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2645161290322581,\n\
\ \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.2645161290322581,\n\
\ \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.031785297106427496,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.031785297106427496\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885416,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885416\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2727272727272727,\n \"acc_stderr\": 0.03173071239071724,\n \"\
acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.03173071239071724\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22279792746113988,\n \"acc_stderr\": 0.030031147977641545,\n\
\ \"acc_norm\": 0.22279792746113988,\n \"acc_norm_stderr\": 0.030031147977641545\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.258974358974359,\n \"acc_stderr\": 0.022211106810061658,\n \
\ \"acc_norm\": 0.258974358974359,\n \"acc_norm_stderr\": 0.022211106810061658\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.026265024608275886,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.026265024608275886\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.25137614678899084,\n\
\ \"acc_stderr\": 0.018599206360287415,\n \"acc_norm\": 0.25137614678899084,\n\
\ \"acc_norm_stderr\": 0.018599206360287415\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.20833333333333334,\n \"acc_stderr\": 0.027696910713093936,\n\
\ \"acc_norm\": 0.20833333333333334,\n \"acc_norm_stderr\": 0.027696910713093936\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25980392156862747,\n \"acc_stderr\": 0.03077855467869326,\n \"\
acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.03077855467869326\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658335,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658335\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.25112107623318386,\n\
\ \"acc_stderr\": 0.029105220833224595,\n \"acc_norm\": 0.25112107623318386,\n\
\ \"acc_norm_stderr\": 0.029105220833224595\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.19083969465648856,\n \"acc_stderr\": 0.03446513350752597,\n\
\ \"acc_norm\": 0.19083969465648856,\n \"acc_norm_stderr\": 0.03446513350752597\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.35537190082644626,\n \"acc_stderr\": 0.04369236326573981,\n \"\
acc_norm\": 0.35537190082644626,\n \"acc_norm_stderr\": 0.04369236326573981\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052191,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052191\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2822085889570552,\n \"acc_stderr\": 0.03536117886664742,\n\
\ \"acc_norm\": 0.2822085889570552,\n \"acc_norm_stderr\": 0.03536117886664742\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.20535714285714285,\n\
\ \"acc_stderr\": 0.038342410214190735,\n \"acc_norm\": 0.20535714285714285,\n\
\ \"acc_norm_stderr\": 0.038342410214190735\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.23300970873786409,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.23300970873786409,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.25213675213675213,\n\
\ \"acc_stderr\": 0.02844796547623102,\n \"acc_norm\": 0.25213675213675213,\n\
\ \"acc_norm_stderr\": 0.02844796547623102\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.29246487867177523,\n\
\ \"acc_stderr\": 0.016267000684598652,\n \"acc_norm\": 0.29246487867177523,\n\
\ \"acc_norm_stderr\": 0.016267000684598652\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.28034682080924855,\n \"acc_stderr\": 0.024182427496577615,\n\
\ \"acc_norm\": 0.28034682080924855,\n \"acc_norm_stderr\": 0.024182427496577615\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23575418994413408,\n\
\ \"acc_stderr\": 0.014196375686290804,\n \"acc_norm\": 0.23575418994413408,\n\
\ \"acc_norm_stderr\": 0.014196375686290804\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.02582916327275748,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.02582916327275748\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3279742765273312,\n\
\ \"acc_stderr\": 0.026664410886937624,\n \"acc_norm\": 0.3279742765273312,\n\
\ \"acc_norm_stderr\": 0.026664410886937624\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.025171041915309684,\n\
\ \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.025171041915309684\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2765957446808511,\n \"acc_stderr\": 0.026684564340460987,\n \
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.026684564340460987\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2646675358539765,\n\
\ \"acc_stderr\": 0.011267332992845535,\n \"acc_norm\": 0.2646675358539765,\n\
\ \"acc_norm_stderr\": 0.011267332992845535\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.26838235294117646,\n \"acc_stderr\": 0.026917481224377232,\n\
\ \"acc_norm\": 0.26838235294117646,\n \"acc_norm_stderr\": 0.026917481224377232\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2647058823529412,\n \"acc_stderr\": 0.017848089574913226,\n \
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.017848089574913226\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2636363636363636,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.2636363636363636,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2163265306122449,\n \"acc_stderr\": 0.02635891633490403,\n\
\ \"acc_norm\": 0.2163265306122449,\n \"acc_norm_stderr\": 0.02635891633490403\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21393034825870647,\n\
\ \"acc_stderr\": 0.028996909693328923,\n \"acc_norm\": 0.21393034825870647,\n\
\ \"acc_norm_stderr\": 0.028996909693328923\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2469879518072289,\n\
\ \"acc_stderr\": 0.03357351982064536,\n \"acc_norm\": 0.2469879518072289,\n\
\ \"acc_norm_stderr\": 0.03357351982064536\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.0356507967070831,\n\
\ \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.0356507967070831\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.01481619599193158,\n \"mc2\": 0.34711273270028703,\n\
\ \"mc2_stderr\": 0.013476356012838524\n }\n}\n```"
repo_url: https://huggingface.co/KnutJaegersberg/LLongMA-3b-LIMA
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|arc:challenge|25_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hellaswag|10_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T20:09:53.352642.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T20:09:53.352642.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T20:09:53.352642.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T20:09:53.352642.parquet'
- config_name: results
data_files:
- split: 2023_09_03T20_09_53.352642
path:
- results_2023-09-03T20:09:53.352642.parquet
- split: latest
path:
- results_2023-09-03T20:09:53.352642.parquet
---
# Dataset Card for Evaluation run of KnutJaegersberg/LLongMA-3b-LIMA
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/LLongMA-3b-LIMA
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KnutJaegersberg/LLongMA-3b-LIMA](https://huggingface.co/KnutJaegersberg/LLongMA-3b-LIMA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__LLongMA-3b-LIMA",
"harness_truthfulqa_mc_0",
	split="latest")
```
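Each per-task entry in the results shares the same layout of `acc`, `acc_stderr`, `acc_norm`, and `acc_norm_stderr` fields shown in the JSON below. As a minimal, self-contained sketch (using a hand-copied subset of the metrics from this card rather than a live download), you could rank tasks by normalized accuracy like this:

```python
# Subset of the aggregated metrics from this card (copied by hand for illustration).
results = {
    "harness|arc:challenge|25": {"acc": 0.3643344709897611, "acc_norm": 0.39078498293515357},
    "harness|hellaswag|10": {"acc": 0.48904600677155946, "acc_norm": 0.6714797849034057},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.35, "acc_norm": 0.35},
}

# Sort tasks by normalized accuracy, best first.
ranked = sorted(results.items(), key=lambda kv: kv[1]["acc_norm"], reverse=True)
best_task, best_metrics = ranked[0]
print(best_task, best_metrics["acc_norm"])  # hellaswag has the highest acc_norm here
```

The same pattern applies to the full results dictionary once loaded from the "results" configuration.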
## Latest results
These are the [latest results from run 2023-09-03T20:09:53.352642](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__LLongMA-3b-LIMA/blob/main/results_2023-09-03T20%3A09%3A53.352642.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.2698380429406577,
"acc_stderr": 0.03212683690771179,
"acc_norm": 0.2733784546368821,
"acc_norm_stderr": 0.03212503812423912,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.01481619599193158,
"mc2": 0.34711273270028703,
"mc2_stderr": 0.013476356012838524
},
"harness|arc:challenge|25": {
"acc": 0.3643344709897611,
"acc_stderr": 0.014063260279882417,
"acc_norm": 0.39078498293515357,
"acc_norm_stderr": 0.014258563880513778
},
"harness|hellaswag|10": {
"acc": 0.48904600677155946,
"acc_stderr": 0.0049885838203099185,
"acc_norm": 0.6714797849034057,
"acc_norm_stderr": 0.004687151994791093
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.04094376269996794,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.04094376269996794
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.25,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2792452830188679,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.2792452830188679,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.28901734104046245,
"acc_stderr": 0.03456425745087001,
"acc_norm": 0.28901734104046245,
"acc_norm_stderr": 0.03456425745087001
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102967,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102967
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.31724137931034485,
"acc_stderr": 0.038783523721386215,
"acc_norm": 0.31724137931034485,
"acc_norm_stderr": 0.038783523721386215
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2645161290322581,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.2645161290322581,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.031785297106427496,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.031785297106427496
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885416,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885416
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22279792746113988,
"acc_stderr": 0.030031147977641545,
"acc_norm": 0.22279792746113988,
"acc_norm_stderr": 0.030031147977641545
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.258974358974359,
"acc_stderr": 0.022211106810061658,
"acc_norm": 0.258974358974359,
"acc_norm_stderr": 0.022211106810061658
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.026265024608275886,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.026265024608275886
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.25137614678899084,
"acc_stderr": 0.018599206360287415,
"acc_norm": 0.25137614678899084,
"acc_norm_stderr": 0.018599206360287415
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.027696910713093936,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.027696910713093936
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658335,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658335
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.25112107623318386,
"acc_stderr": 0.029105220833224595,
"acc_norm": 0.25112107623318386,
"acc_norm_stderr": 0.029105220833224595
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.19083969465648856,
"acc_stderr": 0.03446513350752597,
"acc_norm": 0.19083969465648856,
"acc_norm_stderr": 0.03446513350752597
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.35537190082644626,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.35537190082644626,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052191,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052191
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2822085889570552,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.2822085889570552,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.20535714285714285,
"acc_stderr": 0.038342410214190735,
"acc_norm": 0.20535714285714285,
"acc_norm_stderr": 0.038342410214190735
},
"harness|hendrycksTest-management|5": {
"acc": 0.23300970873786409,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.23300970873786409,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.25213675213675213,
"acc_stderr": 0.02844796547623102,
"acc_norm": 0.25213675213675213,
"acc_norm_stderr": 0.02844796547623102
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.29246487867177523,
"acc_stderr": 0.016267000684598652,
"acc_norm": 0.29246487867177523,
"acc_norm_stderr": 0.016267000684598652
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.28034682080924855,
"acc_stderr": 0.024182427496577615,
"acc_norm": 0.28034682080924855,
"acc_norm_stderr": 0.024182427496577615
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23575418994413408,
"acc_stderr": 0.014196375686290804,
"acc_norm": 0.23575418994413408,
"acc_norm_stderr": 0.014196375686290804
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3279742765273312,
"acc_stderr": 0.026664410886937624,
"acc_norm": 0.3279742765273312,
"acc_norm_stderr": 0.026664410886937624
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.025171041915309684,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.025171041915309684
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.026684564340460987,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.026684564340460987
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2646675358539765,
"acc_stderr": 0.011267332992845535,
"acc_norm": 0.2646675358539765,
"acc_norm_stderr": 0.011267332992845535
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.26838235294117646,
"acc_stderr": 0.026917481224377232,
"acc_norm": 0.26838235294117646,
"acc_norm_stderr": 0.026917481224377232
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.017848089574913226,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.017848089574913226
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2636363636363636,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.2636363636363636,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2163265306122449,
"acc_stderr": 0.02635891633490403,
"acc_norm": 0.2163265306122449,
"acc_norm_stderr": 0.02635891633490403
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21393034825870647,
"acc_stderr": 0.028996909693328923,
"acc_norm": 0.21393034825870647,
"acc_norm_stderr": 0.028996909693328923
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2469879518072289,
"acc_stderr": 0.03357351982064536,
"acc_norm": 0.2469879518072289,
"acc_norm_stderr": 0.03357351982064536
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.0356507967070831,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.0356507967070831
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.01481619599193158,
"mc2": 0.34711273270028703,
"mc2_stderr": 0.013476356012838524
}
}
```
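As a quick sanity check on aggregate figures like these, the per-task accuracies can be compared against the 25% random-chance baseline of four-option multiple choice. A minimal sketch, with a few values hardcoded from the results above purely for illustration:

```python
# Compare a few task accuracies from the results above against the
# random-chance baseline for 4-option multiple choice (0.25).
results = {
    "arc:challenge": 0.3643344709897611,
    "hendrycksTest-formal_logic": 0.15079365079365079,
    "hendrycksTest-world_religions": 0.3157894736842105,
}
BASELINE = 0.25  # expected accuracy from uniform random guessing

# Keep only the tasks where the model beats random guessing.
above_chance = {task: acc for task, acc in results.items() if acc > BASELINE}
print(above_chance)
```

This is only a rough heuristic; the `acc_stderr` values reported alongside each score indicate how much uncertainty to attach to such comparisons.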
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Moghazy/xyz_321 | 2023-09-03T20:12:23.000Z | [
"region:us"
] | Moghazy | null | null | null | 0 | 0 | Entry not found |
zares/wiki.tr.txt | 2023-09-03T20:29:15.000Z | [
"license:other",
"region:us"
] | zares | null | null | null | 0 | 0 | ---
license: other
---
|
kpola009/cashq100 | 2023-09-03T20:40:36.000Z | [
"license:apache-2.0",
"region:us"
] | kpola009 | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
CommunistCowGod/quasarcake | 2023-09-03T20:38:48.000Z | [
"license:openrail",
"region:us"
] | CommunistCowGod | null | null | null | 0 | 0 | ---
license: openrail
---
|
marasama/nva-Nurnberg | 2023-09-03T21:04:58.000Z | [
"region:us"
] | marasama | null | null | null | 0 | 0 | Entry not found |
ducuwu/barna | 2023-09-03T20:52:29.000Z | [
"region:us"
] | ducuwu | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v18_B-7B | 2023-09-22T18:47:24.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of PeanutJar/LLaMa-2-PeanutButter_v18_B-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PeanutJar/LLaMa-2-PeanutButter_v18_B-7B](https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v18_B-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v18_B-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T18:47:12.642745](https://huggingface.co/datasets/open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v18_B-7B/blob/main/results_2023-09-22T18-47-12.642745.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0050335570469798654,\n\
\ \"em_stderr\": 0.0007247385547751907,\n \"f1\": 0.060973154362416224,\n\
\ \"f1_stderr\": 0.0014562854103949273,\n \"acc\": 0.40513399869433026,\n\
\ \"acc_stderr\": 0.009524554979348756\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0050335570469798654,\n \"em_stderr\": 0.0007247385547751907,\n\
\ \"f1\": 0.060973154362416224,\n \"f1_stderr\": 0.0014562854103949273\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06520090978013647,\n \
\ \"acc_stderr\": 0.006800302989321091\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.745067087608524,\n \"acc_stderr\": 0.012248806969376422\n\
\ }\n}\n```"
repo_url: https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v18_B-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|arc:challenge|25_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T18_47_12.642745
path:
- '**/details_harness|drop|3_2023-09-22T18-47-12.642745.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T18-47-12.642745.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T18_47_12.642745
path:
- '**/details_harness|gsm8k|5_2023-09-22T18-47-12.642745.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T18-47-12.642745.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hellaswag|10_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T22:06:17.603163.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T22:06:17.603163.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T22:06:17.603163.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T18_47_12.642745
path:
- '**/details_harness|winogrande|5_2023-09-22T18-47-12.642745.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T18-47-12.642745.parquet'
- config_name: results
data_files:
- split: 2023_09_03T22_06_17.603163
path:
- results_2023-09-03T22:06:17.603163.parquet
- split: 2023_09_22T18_47_12.642745
path:
- results_2023-09-22T18-47-12.642745.parquet
- split: latest
path:
- results_2023-09-22T18-47-12.642745.parquet
---
# Dataset Card for Evaluation run of PeanutJar/LLaMa-2-PeanutButter_v18_B-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v18_B-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PeanutJar/LLaMa-2-PeanutButter_v18_B-7B](https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v18_B-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v18_B-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-22T18:47:12.642745](https://huggingface.co/datasets/open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v18_B-7B/blob/main/results_2023-09-22T18-47-12.642745.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0050335570469798654,
"em_stderr": 0.0007247385547751907,
"f1": 0.060973154362416224,
"f1_stderr": 0.0014562854103949273,
"acc": 0.40513399869433026,
"acc_stderr": 0.009524554979348756
},
"harness|drop|3": {
"em": 0.0050335570469798654,
"em_stderr": 0.0007247385547751907,
"f1": 0.060973154362416224,
"f1_stderr": 0.0014562854103949273
},
"harness|gsm8k|5": {
"acc": 0.06520090978013647,
"acc_stderr": 0.006800302989321091
},
"harness|winogrande|5": {
"acc": 0.745067087608524,
"acc_stderr": 0.012248806969376422
}
}
```
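The aggregated results above are plain JSON, so a run's metrics can be pulled out with the standard library alone once the file is downloaded. A minimal sketch (the JSON below is an abridged copy of two of the sections shown above, used here as a stand-in for the downloaded file):

```python
import json

# Abridged copy of the aggregated results shown above.
results_json = """
{
  "all": {
    "em": 0.0050335570469798654,
    "f1": 0.060973154362416224,
    "acc": 0.40513399869433026
  },
  "harness|gsm8k|5": {
    "acc": 0.06520090978013647
  },
  "harness|winogrande|5": {
    "acc": 0.745067087608524
  }
}
"""

results = json.loads(results_json)

# Per-task metrics are keyed as "harness|<task>|<n_shot>";
# the "all" section holds the averages across tasks.
winogrande_acc = results["harness|winogrande|5"]["acc"]
gsm8k_acc = results["harness|gsm8k|5"]["acc"]
```

The same keys appear as configuration names above, with `|` and `:` replaced to form names like `harness_winogrande_5`.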
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
profetize/kirsten_v5 | 2023-09-03T22:09:47.000Z | [
"region:us"
] | profetize | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validate
path: data/validate-*
dataset_info:
features:
- name: Filename
dtype: string
- name: URL
dtype: string
- name: Content
dtype: string
splits:
- name: train
num_bytes: 243360307.832761
num_examples: 10482
- name: test
num_bytes: 81143319.5836195
num_examples: 3495
- name: validate
num_bytes: 81143319.5836195
num_examples: 3495
download_size: 237943569
dataset_size: 405646947.0
---
# Dataset Card for "kirsten_v5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kgiamalis/Llama-2-train | 2023-09-03T22:58:08.000Z | [
"license:cc-by-nc-sa-4.0",
"region:us"
] | kgiamalis | null | null | null | 0 | 0 | ---
license: cc-by-nc-sa-4.0
---
|
YoungPhlo/juyongjiang-codeup_master_data_new_codealpaca_standardized | 2023-09-09T02:03:22.000Z | [
"region:us"
] | YoungPhlo | null | null | null | 0 | 0 | Entry not found |
dvbviana/audio | 2023-09-03T23:02:32.000Z | [
"region:us"
] | dvbviana | null | null | null | 0 | 0 | Entry not found |
volvoDon/necronomicon | 2023-09-04T02:28:27.000Z | [
"license:apache-2.0",
"region:us"
] | volvoDon | null | null | null | 0 | 0 | ---
license: apache-2.0
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 178080
num_examples: 1
download_size: 0
dataset_size: 178080
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
noe-zaabi/LLM-Science-Standardized | 2023-09-03T23:54:32.000Z | [
"license:mit",
"region:us"
] | noe-zaabi | null | null | null | 0 | 0 | ---
license: mit
---
|
marasama/nva-Rahul_Love | 2023-09-04T00:04:16.000Z | [
"region:us"
] | marasama | null | null | null | 0 | 0 | Entry not found |
volvoDon/mr-golem | 2023-09-04T00:06:47.000Z | [
"region:us"
] | volvoDon | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 155724.0
num_examples: 19
- name: test
num_bytes: 24588.0
num_examples: 3
download_size: 103142
dataset_size: 180312.0
---
# Dataset Card for "mr-golem"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MouseTrap/maow-maow-dataset-v2-5 | 2023-09-04T00:32:08.000Z | [
"region:us"
] | MouseTrap | null | null | null | 0 | 0 | Entry not found |
suoixonbor/hf_dataset_repo | 2023-09-04T01:35:46.000Z | [
"region:us"
] | suoixonbor | null | null | null | 0 | 0 | Entry not found |
miazhao/prm800k_rating_cls | 2023-09-04T01:54:58.000Z | [
"region:us"
] | miazhao | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: input
dtype: string
- name: rating
dtype: int64
splits:
- name: train
num_bytes: 649475535
num_examples: 801063
download_size: 94263081
dataset_size: 649475535
---
# Dataset Card for "prm800k_rating_cls"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bernardo-de-almeida/Test | 2023-09-05T16:32:46.000Z | [
"region:us"
] | bernardo-de-almeida | null | null | null | 0 | 0 | Entry not found |
Miosdream/vits2 | 2023-09-04T03:02:08.000Z | [
"license:openrail",
"region:us"
] | Miosdream | null | null | null | 0 | 0 | ---
license: openrail
---
|
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj | 2023-09-04T02:21:05.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj](https://huggingface.co/CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-04T02:19:44.261303](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj/blob/main/results_2023-09-04T02%3A19%3A44.261303.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5464081364607037,\n\
\ \"acc_stderr\": 0.03441125479940618,\n \"acc_norm\": 0.550630815969306,\n\
\ \"acc_norm_stderr\": 0.03439157848845273,\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766372,\n \"mc2\": 0.38643080121718054,\n\
\ \"mc2_stderr\": 0.01402367136852026\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5554607508532423,\n \"acc_stderr\": 0.014521226405627077,\n\
\ \"acc_norm\": 0.5972696245733788,\n \"acc_norm_stderr\": 0.014332236306790149\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6032662816172077,\n\
\ \"acc_stderr\": 0.004882200364432368,\n \"acc_norm\": 0.8105954989046007,\n\
\ \"acc_norm_stderr\": 0.003910288117015167\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490438,\n\
\ \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490438\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n\
\ \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.47398843930635837,\n\
\ \"acc_stderr\": 0.038073017265045125,\n \"acc_norm\": 0.47398843930635837,\n\
\ \"acc_norm_stderr\": 0.038073017265045125\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192118,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192118\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31746031746031744,\n \"acc_stderr\": 0.023973861998992072,\n \"\
acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.023973861998992072\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6483870967741936,\n\
\ \"acc_stderr\": 0.02716253782694846,\n \"acc_norm\": 0.6483870967741936,\n\
\ \"acc_norm_stderr\": 0.02716253782694846\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4236453201970443,\n \"acc_stderr\": 0.03476725747649038,\n\
\ \"acc_norm\": 0.4236453201970443,\n \"acc_norm_stderr\": 0.03476725747649038\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091707,\n\
\ \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091707\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.033586181457325226,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.033586181457325226\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.028697873971860695,\n\
\ \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.028697873971860695\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.02529460802398647,\n \
\ \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.02529460802398647\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.03218358107742613,\n \
\ \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.03218358107742613\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.03511807571804724,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.03511807571804724\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7596330275229358,\n \"acc_stderr\": 0.01832060732096407,\n \"\
acc_norm\": 0.7596330275229358,\n \"acc_norm_stderr\": 0.01832060732096407\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.033622774366080445,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.033622774366080445\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923403,\n \"\
acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923403\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7215189873417721,\n \"acc_stderr\": 0.02917868230484253,\n \
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.02917868230484253\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\
\ \"acc_stderr\": 0.03252113489929188,\n \"acc_norm\": 0.6233183856502242,\n\
\ \"acc_norm_stderr\": 0.03252113489929188\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n\
\ \"acc_stderr\": 0.04616631111801713,\n \"acc_norm\": 0.6481481481481481,\n\
\ \"acc_norm_stderr\": 0.04616631111801713\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578729,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578729\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572924,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572924\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n\
\ \"acc_stderr\": 0.026655699653922737,\n \"acc_norm\": 0.7905982905982906,\n\
\ \"acc_norm_stderr\": 0.026655699653922737\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7650063856960408,\n\
\ \"acc_stderr\": 0.015162024152278448,\n \"acc_norm\": 0.7650063856960408,\n\
\ \"acc_norm_stderr\": 0.015162024152278448\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6098265895953757,\n \"acc_stderr\": 0.026261677607806642,\n\
\ \"acc_norm\": 0.6098265895953757,\n \"acc_norm_stderr\": 0.026261677607806642\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2782122905027933,\n\
\ \"acc_stderr\": 0.014987325439963547,\n \"acc_norm\": 0.2782122905027933,\n\
\ \"acc_norm_stderr\": 0.014987325439963547\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5947712418300654,\n \"acc_stderr\": 0.028110928492809075,\n\
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.028110928492809075\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.026725868809100793,\n\
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.026725868809100793\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4078014184397163,\n \"acc_stderr\": 0.029316011776343555,\n \
\ \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.029316011776343555\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41460234680573665,\n\
\ \"acc_stderr\": 0.012582597058908284,\n \"acc_norm\": 0.41460234680573665,\n\
\ \"acc_norm_stderr\": 0.012582597058908284\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5220588235294118,\n \"acc_stderr\": 0.03034326422421352,\n\
\ \"acc_norm\": 0.5220588235294118,\n \"acc_norm_stderr\": 0.03034326422421352\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.553921568627451,\n \"acc_stderr\": 0.02010986454718136,\n \
\ \"acc_norm\": 0.553921568627451,\n \"acc_norm_stderr\": 0.02010986454718136\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5836734693877551,\n \"acc_stderr\": 0.031557828165561644,\n\
\ \"acc_norm\": 0.5836734693877551,\n \"acc_norm_stderr\": 0.031557828165561644\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7064676616915423,\n\
\ \"acc_stderr\": 0.032200241045342054,\n \"acc_norm\": 0.7064676616915423,\n\
\ \"acc_norm_stderr\": 0.032200241045342054\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816508\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368036,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368036\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766372,\n \"mc2\": 0.38643080121718054,\n\
\ \"mc2_stderr\": 0.01402367136852026\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|arc:challenge|25_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hellaswag|10_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T02:19:44.261303.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:19:44.261303.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-04T02:19:44.261303.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-04T02:19:44.261303.parquet'
- config_name: results
data_files:
- split: 2023_09_04T02_19_44.261303
path:
- results_2023-09-04T02:19:44.261303.parquet
- split: latest
path:
- results_2023-09-04T02:19:44.261303.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj](https://huggingface.co/CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj",
"harness_truthfulqa_mc_0",
split="train")
```
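The run-specific split names encode the evaluation timestamp with `-` and `:` replaced by `_` (e.g. `2023_09_04T02_19_44.261303`). A minimal helper sketch, assuming that naming pattern holds, to recover a `datetime` from a split name (the function name is illustrative, not part of the `datasets` API):

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    """Parse a run split name like '2023_09_04T02_19_44.261303' into a datetime.

    Assumes the convention seen in this card: the ISO timestamp of the run
    with '-' and ':' replaced by '_'.
    """
    date_part, time_part = split_name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

print(split_to_datetime("2023_09_04T02_19_44.261303"))  # → 2023-09-04 02:19:44.261303
```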
## Latest results
These are the [latest results from run 2023-09-04T02:19:44.261303](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_Fintune_1_17w-q_k_v_o_proj/blob/main/results_2023-09-04T02%3A19%3A44.261303.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5464081364607037,
"acc_stderr": 0.03441125479940618,
"acc_norm": 0.550630815969306,
"acc_norm_stderr": 0.03439157848845273,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766372,
"mc2": 0.38643080121718054,
"mc2_stderr": 0.01402367136852026
},
"harness|arc:challenge|25": {
"acc": 0.5554607508532423,
"acc_stderr": 0.014521226405627077,
"acc_norm": 0.5972696245733788,
"acc_norm_stderr": 0.014332236306790149
},
"harness|hellaswag|10": {
"acc": 0.6032662816172077,
"acc_stderr": 0.004882200364432368,
"acc_norm": 0.8105954989046007,
"acc_norm_stderr": 0.003910288117015167
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.04040311062490438,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.04040311062490438
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5962264150943396,
"acc_stderr": 0.03019761160019795,
"acc_norm": 0.5962264150943396,
"acc_norm_stderr": 0.03019761160019795
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.47398843930635837,
"acc_stderr": 0.038073017265045125,
"acc_norm": 0.47398843930635837,
"acc_norm_stderr": 0.038073017265045125
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192118,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192118
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.023973861998992072,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.023973861998992072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6483870967741936,
"acc_stderr": 0.02716253782694846,
"acc_norm": 0.6483870967741936,
"acc_norm_stderr": 0.02716253782694846
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4236453201970443,
"acc_stderr": 0.03476725747649038,
"acc_norm": 0.4236453201970443,
"acc_norm_stderr": 0.03476725747649038
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091707,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091707
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.033586181457325226,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.033586181457325226
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.028697873971860695,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.028697873971860695
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.02529460802398647,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.02529460802398647
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5672268907563025,
"acc_stderr": 0.03218358107742613,
"acc_norm": 0.5672268907563025,
"acc_norm_stderr": 0.03218358107742613
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.03511807571804724,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.03511807571804724
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7596330275229358,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.7596330275229358,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.033622774366080445,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.033622774366080445
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.030964517926923403,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.030964517926923403
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.02917868230484253,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.02917868230484253
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.03252113489929188,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.03252113489929188
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.04616631111801713,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.04616631111801713
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578729,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578729
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.04498676320572924,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.04498676320572924
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7905982905982906,
"acc_stderr": 0.026655699653922737,
"acc_norm": 0.7905982905982906,
"acc_norm_stderr": 0.026655699653922737
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7650063856960408,
"acc_stderr": 0.015162024152278448,
"acc_norm": 0.7650063856960408,
"acc_norm_stderr": 0.015162024152278448
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6098265895953757,
"acc_stderr": 0.026261677607806642,
"acc_norm": 0.6098265895953757,
"acc_norm_stderr": 0.026261677607806642
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2782122905027933,
"acc_stderr": 0.014987325439963547,
"acc_norm": 0.2782122905027933,
"acc_norm_stderr": 0.014987325439963547
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.028110928492809075,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.028110928492809075
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.026725868809100793,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.026725868809100793
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4078014184397163,
"acc_stderr": 0.029316011776343555,
"acc_norm": 0.4078014184397163,
"acc_norm_stderr": 0.029316011776343555
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41460234680573665,
"acc_stderr": 0.012582597058908284,
"acc_norm": 0.41460234680573665,
"acc_norm_stderr": 0.012582597058908284
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5220588235294118,
"acc_stderr": 0.03034326422421352,
"acc_norm": 0.5220588235294118,
"acc_norm_stderr": 0.03034326422421352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.553921568627451,
"acc_stderr": 0.02010986454718136,
"acc_norm": 0.553921568627451,
"acc_norm_stderr": 0.02010986454718136
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5836734693877551,
"acc_stderr": 0.031557828165561644,
"acc_norm": 0.5836734693877551,
"acc_norm_stderr": 0.031557828165561644
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7064676616915423,
"acc_stderr": 0.032200241045342054,
"acc_norm": 0.7064676616915423,
"acc_norm_stderr": 0.032200241045342054
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368036,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368036
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766372,
"mc2": 0.38643080121718054,
"mc2_stderr": 0.01402367136852026
}
}
```
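The "all" block at the top of the results above appears to be the unweighted mean of the per-task metrics. A hedged sketch for aggregating a results dict of this shape (helper name and the unweighted-mean assumption are illustrative, not taken from the harness):

```python
def mean_metric(results: dict, metric: str = "acc") -> float:
    """Unweighted mean of `metric` over all per-task entries that report it.

    Skips the precomputed "all" entry and any task missing the metric
    (e.g. truthfulqa only reports mc1/mc2).
    """
    values = [v[metric] for k, v in results.items() if k != "all" and metric in v]
    return sum(values) / len(values)

# Synthetic demo data in the same shape as the card's results dict.
demo = {
    "all": {"acc": 0.5},
    "task_a": {"acc": 0.4},
    "task_b": {"acc": 0.6},
    "task_c": {"mc1": 0.3},  # no "acc": ignored by mean_metric
}
print(mean_metric(demo))  # → 0.5
```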
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
fisheryuzy/test_datasets | 2023-09-04T02:23:22.000Z | [
"region:us"
] | fisheryuzy | null | null | null | 0 | 0 | Entry not found |
volvoDon/big-golem | 2023-09-04T02:28:31.000Z | [
"region:us"
] | volvoDon | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 155724.0
num_examples: 19
- name: test
num_bytes: 24588.0
num_examples: 3
download_size: 102998
dataset_size: 180312.0
---
# Dataset Card for "big-golem"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v7 | 2023-09-04T02:39:23.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of yeontaek/llama-2-70B-ensemble-v7
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/llama-2-70B-ensemble-v7](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v7)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v7\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-04T02:38:01.038212](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v7/blob/main/results_2023-09-04T02%3A38%3A01.038212.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6832397060915553,\n\
\ \"acc_stderr\": 0.031693477754770626,\n \"acc_norm\": 0.6869592578044069,\n\
\ \"acc_norm_stderr\": 0.03166529474407705,\n \"mc1\": 0.4418604651162791,\n\
\ \"mc1_stderr\": 0.017384767478986214,\n \"mc2\": 0.6310264033909807,\n\
\ \"mc2_stderr\": 0.01502146266727205\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6749146757679181,\n \"acc_stderr\": 0.01368814730972912,\n\
\ \"acc_norm\": 0.7030716723549488,\n \"acc_norm_stderr\": 0.013352025976725227\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6818362875921131,\n\
\ \"acc_stderr\": 0.004648115322328777,\n \"acc_norm\": 0.873132842063334,\n\
\ \"acc_norm_stderr\": 0.0033214390244115494\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882923,\n\
\ \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882923\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.032166008088022675,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.032166008088022675\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6595744680851063,\n \"acc_stderr\": 0.030976692998534422,\n\
\ \"acc_norm\": 0.6595744680851063,\n \"acc_norm_stderr\": 0.030976692998534422\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.025733641991838987,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.025733641991838987\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8032258064516129,\n\
\ \"acc_stderr\": 0.022616409420742025,\n \"acc_norm\": 0.8032258064516129,\n\
\ \"acc_norm_stderr\": 0.022616409420742025\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066573,\n\
\ \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066573\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8939393939393939,\n \"acc_stderr\": 0.021938047738853113,\n \"\
acc_norm\": 0.8939393939393939,\n \"acc_norm_stderr\": 0.021938047738853113\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678178,\n\
\ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6948717948717948,\n \"acc_stderr\": 0.023346335293325887,\n\
\ \"acc_norm\": 0.6948717948717948,\n \"acc_norm_stderr\": 0.023346335293325887\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948492,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7478991596638656,\n \"acc_stderr\": 0.028205545033277726,\n\
\ \"acc_norm\": 0.7478991596638656,\n \"acc_norm_stderr\": 0.028205545033277726\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4503311258278146,\n \"acc_stderr\": 0.04062290018683775,\n \"\
acc_norm\": 0.4503311258278146,\n \"acc_norm_stderr\": 0.04062290018683775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8880733944954129,\n \"acc_stderr\": 0.013517352714958792,\n \"\
acc_norm\": 0.8880733944954129,\n \"acc_norm_stderr\": 0.013517352714958792\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8970588235294118,\n \"acc_stderr\": 0.02132833757080438,\n \"\
acc_norm\": 0.8970588235294118,\n \"acc_norm_stderr\": 0.02132833757080438\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8818565400843882,\n \"acc_stderr\": 0.021011052659878456,\n \
\ \"acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.021011052659878456\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7623318385650224,\n\
\ \"acc_stderr\": 0.02856807946471428,\n \"acc_norm\": 0.7623318385650224,\n\
\ \"acc_norm_stderr\": 0.02856807946471428\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.0349814938546247,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.0349814938546247\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917671,\n \"\
acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917671\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8098159509202454,\n \"acc_stderr\": 0.03083349114628123,\n\
\ \"acc_norm\": 0.8098159509202454,\n \"acc_norm_stderr\": 0.03083349114628123\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8646232439335888,\n\
\ \"acc_stderr\": 0.012234384586856491,\n \"acc_norm\": 0.8646232439335888,\n\
\ \"acc_norm_stderr\": 0.012234384586856491\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5910614525139665,\n\
\ \"acc_stderr\": 0.016442830654715548,\n \"acc_norm\": 0.5910614525139665,\n\
\ \"acc_norm_stderr\": 0.016442830654715548\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n\
\ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7717041800643086,\n\
\ \"acc_stderr\": 0.023839303311398195,\n \"acc_norm\": 0.7717041800643086,\n\
\ \"acc_norm_stderr\": 0.023839303311398195\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8024691358024691,\n \"acc_stderr\": 0.022152889927898968,\n\
\ \"acc_norm\": 0.8024691358024691,\n \"acc_norm_stderr\": 0.022152889927898968\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5460992907801419,\n \"acc_stderr\": 0.029700453247291474,\n \
\ \"acc_norm\": 0.5460992907801419,\n \"acc_norm_stderr\": 0.029700453247291474\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5430247718383312,\n\
\ \"acc_stderr\": 0.012722869501611419,\n \"acc_norm\": 0.5430247718383312,\n\
\ \"acc_norm_stderr\": 0.012722869501611419\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7238562091503268,\n \"acc_stderr\": 0.018087276935663137,\n \
\ \"acc_norm\": 0.7238562091503268,\n \"acc_norm_stderr\": 0.018087276935663137\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.043502714429232425,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.043502714429232425\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7877551020408163,\n \"acc_stderr\": 0.026176967197866764,\n\
\ \"acc_norm\": 0.7877551020408163,\n \"acc_norm_stderr\": 0.026176967197866764\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n\
\ \"acc_stderr\": 0.023729830881018526,\n \"acc_norm\": 0.8706467661691543,\n\
\ \"acc_norm_stderr\": 0.023729830881018526\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4418604651162791,\n\
\ \"mc1_stderr\": 0.017384767478986214,\n \"mc2\": 0.6310264033909807,\n\
\ \"mc2_stderr\": 0.01502146266727205\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/llama-2-70B-ensemble-v7
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|arc:challenge|25_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hellaswag|10_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-04T02:38:01.038212.parquet'
- config_name: results
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- results_2023-09-04T02:38:01.038212.parquet
- split: latest
path:
- results_2023-09-04T02:38:01.038212.parquet
---
# Dataset Card for Evaluation run of RWKV/rwkv-raven-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/RWKV/rwkv-raven-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [RWKV/rwkv-raven-7b](https://huggingface.co/RWKV/rwkv-raven-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RWKV__rwkv-raven-7b",
	"harness_truthfulqa_mc_0",
	split="latest")
```
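The aggregated numbers reported under the `"all"` key can also be recomputed locally from the per-task scores. As a minimal sketch (the `average_metric` helper and the sample numbers here are illustrative, not part of the dataset tooling):

```python
def average_metric(results, metric="acc"):
    """Mean of `metric` over all per-task entries that report it,
    skipping the precomputed "all" aggregate."""
    values = [
        scores[metric]
        for task, scores in results.items()
        if task != "all" and metric in scores
    ]
    return sum(values) / len(values) if values else None

# Illustrative payload shaped like the results JSON shown below.
sample = {
    "all": {"acc": 0.5},
    "harness|arc:challenge|25": {"acc": 0.6, "acc_norm": 0.7},
    "harness|hellaswag|10": {"acc": 0.4, "acc_norm": 0.8},
}
print(average_metric(sample))              # → 0.5
print(average_metric(sample, "acc_norm"))  # → 0.75
```

The same helper can be pointed at the dict loaded from the `results` configuration or the results JSON file linked below.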
## Latest results
These are the [latest results from run 2023-09-04T02:38:01.038212](https://huggingface.co/datasets/open-llm-leaderboard/details_RWKV__rwkv-raven-7b/blob/main/results_2023-09-04T02%3A38%3A01.038212.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6832397060915553,
"acc_stderr": 0.031693477754770626,
"acc_norm": 0.6869592578044069,
"acc_norm_stderr": 0.03166529474407705,
"mc1": 0.4418604651162791,
"mc1_stderr": 0.017384767478986214,
"mc2": 0.6310264033909807,
"mc2_stderr": 0.01502146266727205
},
"harness|arc:challenge|25": {
"acc": 0.6749146757679181,
"acc_stderr": 0.01368814730972912,
"acc_norm": 0.7030716723549488,
"acc_norm_stderr": 0.013352025976725227
},
"harness|hellaswag|10": {
"acc": 0.6818362875921131,
"acc_stderr": 0.004648115322328777,
"acc_norm": 0.873132842063334,
"acc_norm_stderr": 0.0033214390244115494
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.03355045304882923,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.03355045304882923
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7283018867924528,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.7283018867924528,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.032166008088022675,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.032166008088022675
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6595744680851063,
"acc_stderr": 0.030976692998534422,
"acc_norm": 0.6595744680851063,
"acc_norm_stderr": 0.030976692998534422
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.025733641991838987,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.025733641991838987
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8032258064516129,
"acc_stderr": 0.022616409420742025,
"acc_norm": 0.8032258064516129,
"acc_norm_stderr": 0.022616409420742025
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066573,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066573
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8939393939393939,
"acc_stderr": 0.021938047738853113,
"acc_norm": 0.8939393939393939,
"acc_norm_stderr": 0.021938047738853113
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678178,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6948717948717948,
"acc_stderr": 0.023346335293325887,
"acc_norm": 0.6948717948717948,
"acc_norm_stderr": 0.023346335293325887
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948492,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7478991596638656,
"acc_stderr": 0.028205545033277726,
"acc_norm": 0.7478991596638656,
"acc_norm_stderr": 0.028205545033277726
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4503311258278146,
"acc_stderr": 0.04062290018683775,
"acc_norm": 0.4503311258278146,
"acc_norm_stderr": 0.04062290018683775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8880733944954129,
"acc_stderr": 0.013517352714958792,
"acc_norm": 0.8880733944954129,
"acc_norm_stderr": 0.013517352714958792
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8970588235294118,
"acc_stderr": 0.02132833757080438,
"acc_norm": 0.8970588235294118,
"acc_norm_stderr": 0.02132833757080438
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8818565400843882,
"acc_stderr": 0.021011052659878456,
"acc_norm": 0.8818565400843882,
"acc_norm_stderr": 0.021011052659878456
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7623318385650224,
"acc_stderr": 0.02856807946471428,
"acc_norm": 0.7623318385650224,
"acc_norm_stderr": 0.02856807946471428
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.0349814938546247,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.0349814938546247
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917671,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8098159509202454,
"acc_stderr": 0.03083349114628123,
"acc_norm": 0.8098159509202454,
"acc_norm_stderr": 0.03083349114628123
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8646232439335888,
"acc_stderr": 0.012234384586856491,
"acc_norm": 0.8646232439335888,
"acc_norm_stderr": 0.012234384586856491
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5910614525139665,
"acc_stderr": 0.016442830654715548,
"acc_norm": 0.5910614525139665,
"acc_norm_stderr": 0.016442830654715548
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7717041800643086,
"acc_stderr": 0.023839303311398195,
"acc_norm": 0.7717041800643086,
"acc_norm_stderr": 0.023839303311398195
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8024691358024691,
"acc_stderr": 0.022152889927898968,
"acc_norm": 0.8024691358024691,
"acc_norm_stderr": 0.022152889927898968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5460992907801419,
"acc_stderr": 0.029700453247291474,
"acc_norm": 0.5460992907801419,
"acc_norm_stderr": 0.029700453247291474
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5430247718383312,
"acc_stderr": 0.012722869501611419,
"acc_norm": 0.5430247718383312,
"acc_norm_stderr": 0.012722869501611419
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7238562091503268,
"acc_stderr": 0.018087276935663137,
"acc_norm": 0.7238562091503268,
"acc_norm_stderr": 0.018087276935663137
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.043502714429232425,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.043502714429232425
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7877551020408163,
"acc_stderr": 0.026176967197866764,
"acc_norm": 0.7877551020408163,
"acc_norm_stderr": 0.026176967197866764
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.023729830881018526,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.023729830881018526
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4418604651162791,
"mc1_stderr": 0.017384767478986214,
"mc2": 0.6310264033909807,
"mc2_stderr": 0.01502146266727205
}
}
```
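As a quick sanity check, the per-subtask accuracies in the JSON above can be aggregated directly in Python. The values below are copied from three of the `hendrycksTest` entries; this is an illustrative snippet only, not part of the evaluation harness, which computes its own aggregates:

```python
# Accuracies copied from three of the hendrycksTest entries above
accs = {
    "business_ethics": 0.75,
    "college_chemistry": 0.5,
    "computer_security": 0.76,
}

# Unweighted (macro) mean over the selected subtasks
macro_avg = sum(accs.values()) / len(accs)
print(round(macro_avg, 4))  # -> 0.67
```

The leaderboard's headline MMLU number is the macro average over all 57 subtasks, so averaging a subset like this only approximates it.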
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
MouseTrap/maow-maow-dataset-v3 | 2023-09-04T02:47:11.000Z | [
"region:us"
] | MouseTrap | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_porkorbeef__Llama-2-13b-public | 2023-09-04T02:47:10.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of porkorbeef/Llama-2-13b-public
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [porkorbeef/Llama-2-13b-public](https://huggingface.co/porkorbeef/Llama-2-13b-public)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_porkorbeef__Llama-2-13b-public\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-04T02:45:47.354690](https://huggingface.co/datasets/open-llm-leaderboard/details_porkorbeef__Llama-2-13b-public/blob/main/results_2023-09-04T02%3A45%3A47.354690.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2282108854679419,\n\
\ \"acc_stderr\": 0.030413540158606617,\n \"acc_norm\": 0.2292442269285721,\n\
\ \"acc_norm_stderr\": 0.030427654743837382,\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062135,\n \"mc2\": 0.49010098271988395,\n\
\ \"mc2_stderr\": 0.01677407134302731\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.24658703071672355,\n \"acc_stderr\": 0.012595726268790124,\n\
\ \"acc_norm\": 0.29948805460750855,\n \"acc_norm_stderr\": 0.013385021637313567\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2584146584345748,\n\
\ \"acc_stderr\": 0.004368684255626186,\n \"acc_norm\": 0.2664807807209719,\n\
\ \"acc_norm_stderr\": 0.004412149415717919\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n\
\ \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n\
\ \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03523807393012047,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03523807393012047\n \
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.19245283018867926,\n \"acc_stderr\": 0.024262979839372277,\n\
\ \"acc_norm\": 0.19245283018867926,\n \"acc_norm_stderr\": 0.024262979839372277\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.22916666666666666,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n\
\ \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.18497109826589594,\n\
\ \"acc_stderr\": 0.029605623981771224,\n \"acc_norm\": 0.18497109826589594,\n\
\ \"acc_norm_stderr\": 0.029605623981771224\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179962,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179962\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2723404255319149,\n \"acc_stderr\": 0.029101290698386705,\n\
\ \"acc_norm\": 0.2723404255319149,\n \"acc_norm_stderr\": 0.029101290698386705\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748142,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748142\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2896551724137931,\n \"acc_stderr\": 0.03780019230438014,\n\
\ \"acc_norm\": 0.2896551724137931,\n \"acc_norm_stderr\": 0.03780019230438014\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24074074074074073,\n \"acc_stderr\": 0.022019080012217886,\n \"\
acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.022019080012217886\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047181,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047181\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.02275520495954294,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.02275520495954294\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.23645320197044334,\n \"acc_stderr\": 0.029896114291733545,\n\
\ \"acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.029896114291733545\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.23232323232323232,\n \"acc_stderr\": 0.030088629490217483,\n \"\
acc_norm\": 0.23232323232323232,\n \"acc_norm_stderr\": 0.030088629490217483\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.15544041450777202,\n \"acc_stderr\": 0.026148483469153314,\n\
\ \"acc_norm\": 0.15544041450777202,\n \"acc_norm_stderr\": 0.026148483469153314\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.17435897435897435,\n \"acc_stderr\": 0.01923724980340523,\n\
\ \"acc_norm\": 0.17435897435897435,\n \"acc_norm_stderr\": 0.01923724980340523\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.02813325257881563,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.02813325257881563\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.024762902678057933,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.024762902678057933\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.035433042343899844,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.035433042343899844\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21284403669724772,\n \"acc_stderr\": 0.017549376389313694,\n \"\
acc_norm\": 0.21284403669724772,\n \"acc_norm_stderr\": 0.017549376389313694\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.17592592592592593,\n \"acc_stderr\": 0.02596742095825853,\n \"\
acc_norm\": 0.17592592592592593,\n \"acc_norm_stderr\": 0.02596742095825853\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.22058823529411764,\n \"acc_stderr\": 0.029102254389674082,\n \"\
acc_norm\": 0.22058823529411764,\n \"acc_norm_stderr\": 0.029102254389674082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.27848101265822783,\n \"acc_stderr\": 0.029178682304842555,\n \
\ \"acc_norm\": 0.27848101265822783,\n \"acc_norm_stderr\": 0.029178682304842555\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.26905829596412556,\n\
\ \"acc_stderr\": 0.029763779406874975,\n \"acc_norm\": 0.26905829596412556,\n\
\ \"acc_norm_stderr\": 0.029763779406874975\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.13740458015267176,\n \"acc_stderr\": 0.030194823996804468,\n\
\ \"acc_norm\": 0.13740458015267176,\n \"acc_norm_stderr\": 0.030194823996804468\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2066115702479339,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.2066115702479339,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.20245398773006135,\n \"acc_stderr\": 0.03157065078911902,\n\
\ \"acc_norm\": 0.20245398773006135,\n \"acc_norm_stderr\": 0.03157065078911902\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.18803418803418803,\n\
\ \"acc_stderr\": 0.025598193686652258,\n \"acc_norm\": 0.18803418803418803,\n\
\ \"acc_norm_stderr\": 0.025598193686652258\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2503192848020434,\n\
\ \"acc_stderr\": 0.01549108895149458,\n \"acc_norm\": 0.2503192848020434,\n\
\ \"acc_norm_stderr\": 0.01549108895149458\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23121387283236994,\n \"acc_stderr\": 0.022698657167855713,\n\
\ \"acc_norm\": 0.23121387283236994,\n \"acc_norm_stderr\": 0.022698657167855713\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2558659217877095,\n\
\ \"acc_stderr\": 0.014593620923210761,\n \"acc_norm\": 0.2558659217877095,\n\
\ \"acc_norm_stderr\": 0.014593620923210761\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.23202614379084968,\n \"acc_stderr\": 0.02417084087934102,\n\
\ \"acc_norm\": 0.23202614379084968,\n \"acc_norm_stderr\": 0.02417084087934102\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.21543408360128619,\n\
\ \"acc_stderr\": 0.02335022547547143,\n \"acc_norm\": 0.21543408360128619,\n\
\ \"acc_norm_stderr\": 0.02335022547547143\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22839506172839505,\n \"acc_stderr\": 0.023358211840626267,\n\
\ \"acc_norm\": 0.22839506172839505,\n \"acc_norm_stderr\": 0.023358211840626267\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24822695035460993,\n \"acc_stderr\": 0.0257700156442904,\n \
\ \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.0257700156442904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24511082138200782,\n\
\ \"acc_stderr\": 0.010986307870045519,\n \"acc_norm\": 0.24511082138200782,\n\
\ \"acc_norm_stderr\": 0.010986307870045519\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.1875,\n \"acc_stderr\": 0.023709788253811766,\n \
\ \"acc_norm\": 0.1875,\n \"acc_norm_stderr\": 0.023709788253811766\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2565359477124183,\n \"acc_stderr\": 0.017667841612379,\n \
\ \"acc_norm\": 0.2565359477124183,\n \"acc_norm_stderr\": 0.017667841612379\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.19090909090909092,\n\
\ \"acc_stderr\": 0.03764425585984926,\n \"acc_norm\": 0.19090909090909092,\n\
\ \"acc_norm_stderr\": 0.03764425585984926\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2653061224489796,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.2653061224489796,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.25870646766169153,\n\
\ \"acc_stderr\": 0.03096590312357302,\n \"acc_norm\": 0.25870646766169153,\n\
\ \"acc_norm_stderr\": 0.03096590312357302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036622,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036622\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2289156626506024,\n\
\ \"acc_stderr\": 0.03270745277352477,\n \"acc_norm\": 0.2289156626506024,\n\
\ \"acc_norm_stderr\": 0.03270745277352477\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.17543859649122806,\n \"acc_stderr\": 0.02917088550072767,\n\
\ \"acc_norm\": 0.17543859649122806,\n \"acc_norm_stderr\": 0.02917088550072767\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062135,\n \"mc2\": 0.49010098271988395,\n\
\ \"mc2_stderr\": 0.01677407134302731\n }\n}\n```"
repo_url: https://huggingface.co/porkorbeef/Llama-2-13b-public
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|arc:challenge|25_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hellaswag|10_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T02:45:47.354690.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:45:47.354690.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-04T02:45:47.354690.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-04T02:45:47.354690.parquet'
- config_name: results
data_files:
- split: 2023_09_04T02_45_47.354690
path:
- results_2023-09-04T02:45:47.354690.parquet
- split: latest
path:
- results_2023-09-04T02:45:47.354690.parquet
---
# Dataset Card for Evaluation run of porkorbeef/Llama-2-13b-public
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/porkorbeef/Llama-2-13b-public
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [porkorbeef/Llama-2-13b-public](https://huggingface.co/porkorbeef/Llama-2-13b-public) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_porkorbeef__Llama-2-13b-public",
"harness_truthfulqa_mc_0",
split="train")
```
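The timestamped split names listed in the configurations above are derived from each run's timestamp by replacing `-` and `:` with `_` (split names cannot contain those characters). A minimal sketch of that mapping, using the run timestamp from this card:

```python
# Convert a run timestamp into the split name used in each configuration.
# The "-" and ":" characters are replaced with "_" to form a valid split name.
run_timestamp = "2023-09-04T02:45:47.354690"
split_name = run_timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2023_09_04T02_45_47.354690
```

This split name (or simply `"latest"`) can be passed as the `split` argument to `load_dataset` to select a specific run.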
## Latest results
These are the [latest results from run 2023-09-04T02:45:47.354690](https://huggingface.co/datasets/open-llm-leaderboard/details_porkorbeef__Llama-2-13b-public/blob/main/results_2023-09-04T02%3A45%3A47.354690.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2282108854679419,
"acc_stderr": 0.030413540158606617,
"acc_norm": 0.2292442269285721,
"acc_norm_stderr": 0.030427654743837382,
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062135,
"mc2": 0.49010098271988395,
"mc2_stderr": 0.01677407134302731
},
"harness|arc:challenge|25": {
"acc": 0.24658703071672355,
"acc_stderr": 0.012595726268790124,
"acc_norm": 0.29948805460750855,
"acc_norm_stderr": 0.013385021637313567
},
"harness|hellaswag|10": {
"acc": 0.2584146584345748,
"acc_stderr": 0.004368684255626186,
"acc_norm": 0.2664807807209719,
"acc_norm_stderr": 0.004412149415717919
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.25,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.19245283018867926,
"acc_stderr": 0.024262979839372277,
"acc_norm": 0.19245283018867926,
"acc_norm_stderr": 0.024262979839372277
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.18497109826589594,
"acc_stderr": 0.029605623981771224,
"acc_norm": 0.18497109826589594,
"acc_norm_stderr": 0.029605623981771224
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179962,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179962
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2723404255319149,
"acc_stderr": 0.029101290698386705,
"acc_norm": 0.2723404255319149,
"acc_norm_stderr": 0.029101290698386705
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748142,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748142
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2896551724137931,
"acc_stderr": 0.03780019230438014,
"acc_norm": 0.2896551724137931,
"acc_norm_stderr": 0.03780019230438014
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.022019080012217886,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.022019080012217886
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047181,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047181
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2,
"acc_stderr": 0.02275520495954294,
"acc_norm": 0.2,
"acc_norm_stderr": 0.02275520495954294
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.029896114291733545,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.029896114291733545
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23232323232323232,
"acc_stderr": 0.030088629490217483,
"acc_norm": 0.23232323232323232,
"acc_norm_stderr": 0.030088629490217483
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.15544041450777202,
"acc_stderr": 0.026148483469153314,
"acc_norm": 0.15544041450777202,
"acc_norm_stderr": 0.026148483469153314
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.17435897435897435,
"acc_stderr": 0.01923724980340523,
"acc_norm": 0.17435897435897435,
"acc_norm_stderr": 0.01923724980340523
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.02813325257881563,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.02813325257881563
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.024762902678057933,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.024762902678057933
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.035433042343899844,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.035433042343899844
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21284403669724772,
"acc_stderr": 0.017549376389313694,
"acc_norm": 0.21284403669724772,
"acc_norm_stderr": 0.017549376389313694
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.17592592592592593,
"acc_stderr": 0.02596742095825853,
"acc_norm": 0.17592592592592593,
"acc_norm_stderr": 0.02596742095825853
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22058823529411764,
"acc_stderr": 0.029102254389674082,
"acc_norm": 0.22058823529411764,
"acc_norm_stderr": 0.029102254389674082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.27848101265822783,
"acc_stderr": 0.029178682304842555,
"acc_norm": 0.27848101265822783,
"acc_norm_stderr": 0.029178682304842555
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.26905829596412556,
"acc_stderr": 0.029763779406874975,
"acc_norm": 0.26905829596412556,
"acc_norm_stderr": 0.029763779406874975
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.13740458015267176,
"acc_stderr": 0.030194823996804468,
"acc_norm": 0.13740458015267176,
"acc_norm_stderr": 0.030194823996804468
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2066115702479339,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.2066115702479339,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.20245398773006135,
"acc_stderr": 0.03157065078911902,
"acc_norm": 0.20245398773006135,
"acc_norm_stderr": 0.03157065078911902
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.18803418803418803,
"acc_stderr": 0.025598193686652258,
"acc_norm": 0.18803418803418803,
"acc_norm_stderr": 0.025598193686652258
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2503192848020434,
"acc_stderr": 0.01549108895149458,
"acc_norm": 0.2503192848020434,
"acc_norm_stderr": 0.01549108895149458
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.022698657167855713,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.022698657167855713
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2558659217877095,
"acc_stderr": 0.014593620923210761,
"acc_norm": 0.2558659217877095,
"acc_norm_stderr": 0.014593620923210761
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23202614379084968,
"acc_stderr": 0.02417084087934102,
"acc_norm": 0.23202614379084968,
"acc_norm_stderr": 0.02417084087934102
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.21543408360128619,
"acc_stderr": 0.02335022547547143,
"acc_norm": 0.21543408360128619,
"acc_norm_stderr": 0.02335022547547143
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22839506172839505,
"acc_stderr": 0.023358211840626267,
"acc_norm": 0.22839506172839505,
"acc_norm_stderr": 0.023358211840626267
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.0257700156442904,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.0257700156442904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24511082138200782,
"acc_stderr": 0.010986307870045519,
"acc_norm": 0.24511082138200782,
"acc_norm_stderr": 0.010986307870045519
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1875,
"acc_stderr": 0.023709788253811766,
"acc_norm": 0.1875,
"acc_norm_stderr": 0.023709788253811766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2565359477124183,
"acc_stderr": 0.017667841612379,
"acc_norm": 0.2565359477124183,
"acc_norm_stderr": 0.017667841612379
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.19090909090909092,
"acc_stderr": 0.03764425585984926,
"acc_norm": 0.19090909090909092,
"acc_norm_stderr": 0.03764425585984926
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2653061224489796,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.2653061224489796,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.25870646766169153,
"acc_stderr": 0.03096590312357302,
"acc_norm": 0.25870646766169153,
"acc_norm_stderr": 0.03096590312357302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036622,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036622
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2289156626506024,
"acc_stderr": 0.03270745277352477,
"acc_norm": 0.2289156626506024,
"acc_norm_stderr": 0.03270745277352477
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.17543859649122806,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.17543859649122806,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062135,
"mc2": 0.49010098271988395,
"mc2_stderr": 0.01677407134302731
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
jlkj/the-stack-moonscript-clean | 2023-09-04T13:20:39.000Z | [
"region:us"
] | jlkj | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: hexsha
dtype: string
- name: size
dtype: int64
- name: content
dtype: string
- name: avg_line_length
dtype: float64
- name: max_line_length
dtype: int64
- name: alphanum_fraction
dtype: float64
splits:
- name: train
num_bytes: 20211229.498295177
num_examples: 5520
- name: test
num_bytes: 1124066.5681117065
num_examples: 307
- name: valid
num_bytes: 1124066.5681117065
num_examples: 307
download_size: 9160074
dataset_size: 22459362.63451859
---
# Dataset Card for "the-stack-moonscript-clean"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KingLTD/Law_data | 2023-09-04T03:07:50.000Z | [
"region:us"
] | KingLTD | null | null | null | 0 | 0 | Entry not found |
YoungPhlo/dahoas-code_review_instruct_critique_revision_python_standardized | 2023-09-04T03:10:31.000Z | [
"region:us"
] | YoungPhlo | null | null | null | 0 | 0 | Entry not found |
FanChen0116/bus_few35_front | 2023-09-07T03:56:10.000Z | [
"region:us"
] | FanChen0116 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: labels
sequence:
class_label:
names:
'0': O
'1': I-from_location
'2': B-from_location
'3': B-leaving_date
'4': I-leaving_date
'5': I-to_location
'6': B-to_location
- name: request_slot
sequence: string
splits:
- name: train
num_bytes: 6172
num_examples: 35
- name: validation
num_bytes: 6900
num_examples: 35
- name: test
num_bytes: 6900
num_examples: 35
download_size: 14148
dataset_size: 19972
---
# Dataset Card for "bus_few35_front"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
goodfellowliu/Set14 | 2023-09-04T06:11:28.000Z | [
"license:openrail",
"region:us"
] | goodfellowliu | null | null | null | 0 | 0 | ---
license: openrail
---
|
FanChen0116/bus_few35_front_empty | 2023-09-07T04:41:33.000Z | [
"region:us"
] | FanChen0116 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: labels
sequence:
class_label:
names:
'0': O
'1': I-from_location
'2': B-from_location
'3': B-leaving_date
'4': I-leaving_date
'5': I-to_location
'6': B-to_location
- name: request_slot
sequence: string
splits:
- name: train
num_bytes: 5491
num_examples: 35
- name: validation
num_bytes: 6128
num_examples: 35
- name: test
num_bytes: 6900
num_examples: 35
download_size: 0
dataset_size: 18519
---
# Dataset Card for "bus_few35_front_empty"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Dalamar96/guanaco-llama2-1k | 2023-09-05T03:57:40.000Z | [
"region:us"
] | Dalamar96 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 0
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cireix/mvml | 2023-09-04T03:30:12.000Z | [
"region:us"
] | cireix | null | null | null | 0 | 0 | ---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/datasetcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/datasets-cards
{}
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yuekai/train_3.5M_CN_with_category_vicuna | 2023-09-04T07:57:59.000Z | [
"region:us"
] | yuekai | null | null | null | 0 | 0 | Entry not found |
wuming156/sdxl | 2023-10-07T04:56:27.000Z | [
"region:us"
] | wuming156 | null | null | null | 0 | 0 | Entry not found |
jijay/instructpix2pix-demo | 2023-09-05T08:03:31.000Z | [
"region:us"
] | jijay | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: input_image
dtype: image
- name: edit_prompt
dtype: string
- name: edited_image
dtype: image
splits:
- name: train
num_bytes: 4850304.0
num_examples: 4
download_size: 0
dataset_size: 4850304.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "instructpix2pix-demo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jiiiM/NeMix-Style_v2.1.2 | 2023-09-06T00:06:49.000Z | [
"license:unknown",
"region:us"
] | jiiiM | null | null | null | 0 | 0 | ---
license: unknown
---
|
jijay/instructpix2pix-demov2 | 2023-09-04T07:37:14.000Z | [
"region:us"
] | jijay | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: input_image
dtype: image
- name: edit_prompt
dtype: string
- name: edited_image
dtype: image
splits:
- name: train
num_bytes: 4872706.0
num_examples: 4
download_size: 4873592
dataset_size: 4872706.0
---
# Dataset Card for "instructpix2pix-demov2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gyataro/cdacm-models | 2023-09-04T04:58:42.000Z | [
"license:gpl-3.0",
"region:us"
] | gyataro | null | null | null | 0 | 0 | ---
license: gpl-3.0
---
|