| id | lastModified | tags | author | description | citation | cardData | likes | downloads | card |
|---|---|---|---|---|---|---|---|---|---|
joey234/mmlu-high_school_european_history-neg-prepend-fix | 2023-08-21T07:35:30.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 35805
num_examples: 5
- name: test
num_bytes: 1243562
num_examples: 165
download_size: 66756
dataset_size: 1279367
---
# Dataset Card for "mmlu-high_school_european_history-neg-prepend-fix"
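Per the YAML above, the `answer` feature is stored as a `class_label` index rather than a letter. A minimal sketch of decoding it back to a letter (the `ANSWER_NAMES` list mirrors the `class_label.names` block in this card; the commented-out `load_dataset` call assumes the repo is publicly readable and the `datasets` library is installed):

```python
# Index -> letter mapping, taken from the class_label block in this card.
ANSWER_NAMES = ["A", "B", "C", "D"]

def decode_answer(idx: int) -> str:
    """Map a stored class-label index back to its answer letter."""
    return ANSWER_NAMES[idx]

# Loading the splits themselves (requires network access):
# from datasets import load_dataset
# ds = load_dataset("joey234/mmlu-high_school_european_history-neg-prepend-fix",
#                   split="test")
# print(ds[0]["question"], "->", decode_answer(ds[0]["answer"]))

print(decode_answer(2))  # -> C
```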
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_haonan-li__bactrian-x-llama-13b-merged | 2023-09-18T01:46:36.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of haonan-li/bactrian-x-llama-13b-merged
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [haonan-li/bactrian-x-llama-13b-merged](https://huggingface.co/haonan-li/bactrian-x-llama-13b-merged)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one\
  \ of the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run\
  \ can be found as a specific split in each configuration, the split being named\
  \ using the timestamp of the run. The \"train\" split always points to the latest\
  \ results.\n\nAn additional configuration \"results\" stores all the aggregated\
  \ results of the run (and is used to compute and display the aggregated metrics\
  \ on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_haonan-li__bactrian-x-llama-13b-merged\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-18T01:46:24.914160](https://huggingface.co/datasets/open-llm-leaderboard/details_haonan-li__bactrian-x-llama-13b-merged/blob/main/results_2023-09-18T01-46-24.914160.json) (note\
  \ that there might be results for other tasks in the repo if successive evals didn't\
  \ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2553481543624161,\n\
\ \"em_stderr\": 0.004465629087714431,\n \"f1\": 0.31091652684563814,\n\
\ \"f1_stderr\": 0.004443751758152442,\n \"acc\": 0.3974435920159074,\n\
\ \"acc_stderr\": 0.009316527734088942\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.2553481543624161,\n \"em_stderr\": 0.004465629087714431,\n\
\ \"f1\": 0.31091652684563814,\n \"f1_stderr\": 0.004443751758152442\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05534495830174375,\n \
\ \"acc_stderr\": 0.006298221796179595\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.739542225730071,\n \"acc_stderr\": 0.012334833671998289\n\
\ }\n}\n```"
repo_url: https://huggingface.co/haonan-li/bactrian-x-llama-13b-merged
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|arc:challenge|25_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_18T01_46_24.914160
path:
- '**/details_harness|drop|3_2023-09-18T01-46-24.914160.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-18T01-46-24.914160.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_18T01_46_24.914160
path:
- '**/details_harness|gsm8k|5_2023-09-18T01-46-24.914160.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-18T01-46-24.914160.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hellaswag|10_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:16:59.483464.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T19:16:59.483464.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T19:16:59.483464.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_18T01_46_24.914160
path:
- '**/details_harness|winogrande|5_2023-09-18T01-46-24.914160.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-18T01-46-24.914160.parquet'
- config_name: results
data_files:
- split: 2023_07_19T19_16_59.483464
path:
- results_2023-07-19T19:16:59.483464.parquet
- split: 2023_09_18T01_46_24.914160
path:
- results_2023-09-18T01-46-24.914160.parquet
- split: latest
path:
- results_2023-09-18T01-46-24.914160.parquet
---
# Dataset Card for Evaluation run of haonan-li/bactrian-x-llama-13b-merged
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/haonan-li/bactrian-x-llama-13b-merged
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [haonan-li/bactrian-x-llama-13b-merged](https://huggingface.co/haonan-li/bactrian-x-llama-13b-merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_haonan-li__bactrian-x-llama-13b-merged",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-18T01:46:24.914160](https://huggingface.co/datasets/open-llm-leaderboard/details_haonan-li__bactrian-x-llama-13b-merged/blob/main/results_2023-09-18T01-46-24.914160.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.2553481543624161,
"em_stderr": 0.004465629087714431,
"f1": 0.31091652684563814,
"f1_stderr": 0.004443751758152442,
"acc": 0.3974435920159074,
"acc_stderr": 0.009316527734088942
},
"harness|drop|3": {
"em": 0.2553481543624161,
"em_stderr": 0.004465629087714431,
"f1": 0.31091652684563814,
"f1_stderr": 0.004443751758152442
},
"harness|gsm8k|5": {
"acc": 0.05534495830174375,
"acc_stderr": 0.006298221796179595
},
"harness|winogrande|5": {
"acc": 0.739542225730071,
"acc_stderr": 0.012334833671998289
}
}
```
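Once loaded, the aggregated metrics can be inspected programmatically. As a minimal, self-contained sketch (the values from the latest run above are hard-coded here rather than fetched from the Hub), the accuracy-based tasks can be compared like this:

```python
# Aggregated results from the latest run shown above (hard-coded for illustration).
latest_results = {
    "harness|drop|3": {"em": 0.2553481543624161, "f1": 0.31091652684563814},
    "harness|gsm8k|5": {"acc": 0.05534495830174375},
    "harness|winogrande|5": {"acc": 0.739542225730071},
}

# Keep only the tasks reporting an accuracy metric and find the strongest one.
acc_tasks = {task: m["acc"] for task, m in latest_results.items() if "acc" in m}
best_task = max(acc_tasks, key=acc_tasks.get)
print(best_task)  # harness|winogrande|5
```

The same dictionary shape is what you get from the "results" configuration of this dataset, so the filtering pattern above applies directly to the loaded records.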
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_heegyu__LIMA2-7b-hf | 2023-08-27T12:39:32.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of heegyu/LIMA2-7b-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [heegyu/LIMA2-7b-hf](https://huggingface.co/heegyu/LIMA2-7b-hf) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_heegyu__LIMA2-7b-hf\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-09T10:35:20.569922](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__LIMA2-7b-hf/blob/main/results_2023-08-09T10%3A35%3A20.569922.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4365900366347548,\n\
\ \"acc_stderr\": 0.03520872253567896,\n \"acc_norm\": 0.44023336829712084,\n\
\ \"acc_norm_stderr\": 0.035193131307052546,\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.016002651487361002,\n \"mc2\": 0.447415487052591,\n\
\ \"mc2_stderr\": 0.015017114299541445\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.014611390804670088,\n \
\ \"acc_norm\": 0.5324232081911263,\n \"acc_norm_stderr\": 0.014580637569995421\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6234813782115116,\n\
\ \"acc_stderr\": 0.0048352227940065195,\n \"acc_norm\": 0.80601473809998,\n\
\ \"acc_norm_stderr\": 0.003946093539722774\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.35555555555555557,\n\
\ \"acc_stderr\": 0.04135176749720386,\n \"acc_norm\": 0.35555555555555557,\n\
\ \"acc_norm_stderr\": 0.04135176749720386\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.40131578947368424,\n \"acc_stderr\": 0.03988903703336284,\n\
\ \"acc_norm\": 0.40131578947368424,\n \"acc_norm_stderr\": 0.03988903703336284\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.44528301886792454,\n \"acc_stderr\": 0.030588052974270658,\n\
\ \"acc_norm\": 0.44528301886792454,\n \"acc_norm_stderr\": 0.030588052974270658\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4722222222222222,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.4722222222222222,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n\
\ \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.43352601156069365,\n\
\ \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.35172413793103446,\n \"acc_stderr\": 0.0397923663749741,\n\
\ \"acc_norm\": 0.35172413793103446,\n \"acc_norm_stderr\": 0.0397923663749741\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2751322751322751,\n \"acc_stderr\": 0.023000086859068663,\n \"\
acc_norm\": 0.2751322751322751,\n \"acc_norm_stderr\": 0.023000086859068663\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574925,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574925\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.4838709677419355,\n \"acc_stderr\": 0.028429203176724555,\n \"\
acc_norm\": 0.4838709677419355,\n \"acc_norm_stderr\": 0.028429203176724555\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3399014778325123,\n \"acc_stderr\": 0.033327690684107895,\n \"\
acc_norm\": 0.3399014778325123,\n \"acc_norm_stderr\": 0.033327690684107895\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5151515151515151,\n \"acc_stderr\": 0.03902551007374449,\n\
\ \"acc_norm\": 0.5151515151515151,\n \"acc_norm_stderr\": 0.03902551007374449\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5505050505050505,\n \"acc_stderr\": 0.035441324919479704,\n \"\
acc_norm\": 0.5505050505050505,\n \"acc_norm_stderr\": 0.035441324919479704\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6010362694300518,\n \"acc_stderr\": 0.03533999094065696,\n\
\ \"acc_norm\": 0.6010362694300518,\n \"acc_norm_stderr\": 0.03533999094065696\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.43846153846153846,\n \"acc_stderr\": 0.025158266016868578,\n\
\ \"acc_norm\": 0.43846153846153846,\n \"acc_norm_stderr\": 0.025158266016868578\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.03214536859788639,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03214536859788639\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257374,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257374\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5798165137614679,\n \"acc_stderr\": 0.0211624200482735,\n \"acc_norm\"\
: 0.5798165137614679,\n \"acc_norm_stderr\": 0.0211624200482735\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.030058202704309846,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.030058202704309846\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.03503235296367992,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03503235296367992\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.48523206751054854,\n \"acc_stderr\": 0.032533028078777386,\n \
\ \"acc_norm\": 0.48523206751054854,\n \"acc_norm_stderr\": 0.032533028078777386\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.484304932735426,\n\
\ \"acc_stderr\": 0.0335412657542081,\n \"acc_norm\": 0.484304932735426,\n\
\ \"acc_norm_stderr\": 0.0335412657542081\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.04338920305792401,\n\
\ \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.04338920305792401\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.512396694214876,\n \"acc_stderr\": 0.04562951548180765,\n \"acc_norm\"\
: 0.512396694214876,\n \"acc_norm_stderr\": 0.04562951548180765\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.04833682445228318,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.04833682445228318\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4110429447852761,\n \"acc_stderr\": 0.038656978537853624,\n\
\ \"acc_norm\": 0.4110429447852761,\n \"acc_norm_stderr\": 0.038656978537853624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952688,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952688\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5922330097087378,\n \"acc_stderr\": 0.048657775704107696,\n\
\ \"acc_norm\": 0.5922330097087378,\n \"acc_norm_stderr\": 0.048657775704107696\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6709401709401709,\n\
\ \"acc_stderr\": 0.030782321577688173,\n \"acc_norm\": 0.6709401709401709,\n\
\ \"acc_norm_stderr\": 0.030782321577688173\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5734355044699873,\n\
\ \"acc_stderr\": 0.017686066975675662,\n \"acc_norm\": 0.5734355044699873,\n\
\ \"acc_norm_stderr\": 0.017686066975675662\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.45375722543352603,\n \"acc_stderr\": 0.02680372058320619,\n\
\ \"acc_norm\": 0.45375722543352603,\n \"acc_norm_stderr\": 0.02680372058320619\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4738562091503268,\n \"acc_stderr\": 0.028590752958852394,\n\
\ \"acc_norm\": 0.4738562091503268,\n \"acc_norm_stderr\": 0.028590752958852394\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.48231511254019294,\n\
\ \"acc_stderr\": 0.02838032284907713,\n \"acc_norm\": 0.48231511254019294,\n\
\ \"acc_norm_stderr\": 0.02838032284907713\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4567901234567901,\n \"acc_stderr\": 0.027716661650194048,\n\
\ \"acc_norm\": 0.4567901234567901,\n \"acc_norm_stderr\": 0.027716661650194048\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.35815602836879434,\n \"acc_stderr\": 0.028602085862759422,\n \
\ \"acc_norm\": 0.35815602836879434,\n \"acc_norm_stderr\": 0.028602085862759422\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.32985658409387225,\n\
\ \"acc_stderr\": 0.012008129938540469,\n \"acc_norm\": 0.32985658409387225,\n\
\ \"acc_norm_stderr\": 0.012008129938540469\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.49264705882352944,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.49264705882352944,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.39215686274509803,\n \"acc_stderr\": 0.01975172650876263,\n \
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.01975172650876263\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380062,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380062\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.47346938775510206,\n \"acc_stderr\": 0.03196412734523272,\n\
\ \"acc_norm\": 0.47346938775510206,\n \"acc_norm_stderr\": 0.03196412734523272\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03333333333333335,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03333333333333335\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.3855421686746988,\n \"acc_stderr\": 0.037891344246115496,\n\
\ \"acc_norm\": 0.3855421686746988,\n \"acc_norm_stderr\": 0.037891344246115496\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6432748538011696,\n\
\ \"acc_stderr\": 0.03674013002860954,\n \"acc_norm\": 0.6432748538011696,\n\
\ \"acc_norm_stderr\": 0.03674013002860954\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.2974296205630355,\n \"mc1_stderr\": 0.016002651487361002,\n\
\ \"mc2\": 0.447415487052591,\n \"mc2_stderr\": 0.015017114299541445\n\
\ }\n}\n```"
repo_url: https://huggingface.co/heegyu/LIMA2-7b-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|arc:challenge|25_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hellaswag|10_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T10:35:20.569922.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:35:20.569922.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T10:35:20.569922.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T10:35:20.569922.parquet'
- config_name: results
data_files:
- split: 2023_08_09T10_35_20.569922
path:
- results_2023-08-09T10:35:20.569922.parquet
- split: latest
path:
- results_2023-08-09T10:35:20.569922.parquet
---
# Dataset Card for Evaluation run of heegyu/LIMA2-7b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/heegyu/LIMA2-7b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [heegyu/LIMA2-7b-hf](https://huggingface.co/heegyu/LIMA2-7b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_heegyu__LIMA2-7b-hf",
"harness_truthfulqa_mc_0",
	split="latest")
```
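Each configuration also exposes one split per evaluation run, named after the run's timestamp (e.g. `2023_08_09T10_35_20.569922`). If several runs accumulate, you can resolve the newest timestamped split without relying on the `latest` alias; a minimal sketch, using a hypothetical list of split names so it needs no network access:

```python
# Split names encode run timestamps like "2023_08_09T10_35_20.569922",
# plus a "latest" alias. Because the fields are zero-padded,
# lexicographic order matches chronological order.
split_names = ["2023_08_09T10_35_20.569922", "latest"]  # hypothetical example

timestamped = [s for s in split_names if s != "latest"]  # drop the alias
newest = max(timestamped)  # lexicographic max == most recent run

print(newest)  # → 2023_08_09T10_35_20.569922
```

In practice you would obtain `split_names` from `datasets.get_dataset_split_names(...)` and pass `split=newest` to `load_dataset`.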
## Latest results
These are the [latest results from run 2023-08-09T10:35:20.569922](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__LIMA2-7b-hf/blob/main/results_2023-08-09T10%3A35%3A20.569922.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4365900366347548,
"acc_stderr": 0.03520872253567896,
"acc_norm": 0.44023336829712084,
"acc_norm_stderr": 0.035193131307052546,
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361002,
"mc2": 0.447415487052591,
"mc2_stderr": 0.015017114299541445
},
"harness|arc:challenge|25": {
"acc": 0.5,
"acc_stderr": 0.014611390804670088,
"acc_norm": 0.5324232081911263,
"acc_norm_stderr": 0.014580637569995421
},
"harness|hellaswag|10": {
"acc": 0.6234813782115116,
"acc_stderr": 0.0048352227940065195,
"acc_norm": 0.80601473809998,
"acc_norm_stderr": 0.003946093539722774
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.04135176749720386,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.04135176749720386
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40131578947368424,
"acc_stderr": 0.03988903703336284,
"acc_norm": 0.40131578947368424,
"acc_norm_stderr": 0.03988903703336284
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.44528301886792454,
"acc_stderr": 0.030588052974270658,
"acc_norm": 0.44528301886792454,
"acc_norm_stderr": 0.030588052974270658
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.03778621079092055,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.03778621079092055
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.35172413793103446,
"acc_stderr": 0.0397923663749741,
"acc_norm": 0.35172413793103446,
"acc_norm_stderr": 0.0397923663749741
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2751322751322751,
"acc_stderr": 0.023000086859068663,
"acc_norm": 0.2751322751322751,
"acc_norm_stderr": 0.023000086859068663
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574925,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574925
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4838709677419355,
"acc_stderr": 0.028429203176724555,
"acc_norm": 0.4838709677419355,
"acc_norm_stderr": 0.028429203176724555
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3399014778325123,
"acc_stderr": 0.033327690684107895,
"acc_norm": 0.3399014778325123,
"acc_norm_stderr": 0.033327690684107895
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5151515151515151,
"acc_stderr": 0.03902551007374449,
"acc_norm": 0.5151515151515151,
"acc_norm_stderr": 0.03902551007374449
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5505050505050505,
"acc_stderr": 0.035441324919479704,
"acc_norm": 0.5505050505050505,
"acc_norm_stderr": 0.035441324919479704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6010362694300518,
"acc_stderr": 0.03533999094065696,
"acc_norm": 0.6010362694300518,
"acc_norm_stderr": 0.03533999094065696
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.43846153846153846,
"acc_stderr": 0.025158266016868578,
"acc_norm": 0.43846153846153846,
"acc_norm_stderr": 0.025158266016868578
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176088,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.03214536859788639,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.03214536859788639
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257374,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257374
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5798165137614679,
"acc_stderr": 0.0211624200482735,
"acc_norm": 0.5798165137614679,
"acc_norm_stderr": 0.0211624200482735
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.030058202704309846,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.030058202704309846
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03503235296367992,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03503235296367992
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.48523206751054854,
"acc_stderr": 0.032533028078777386,
"acc_norm": 0.48523206751054854,
"acc_norm_stderr": 0.032533028078777386
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.484304932735426,
"acc_stderr": 0.0335412657542081,
"acc_norm": 0.484304932735426,
"acc_norm_stderr": 0.0335412657542081
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5725190839694656,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.5725190839694656,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.512396694214876,
"acc_stderr": 0.04562951548180765,
"acc_norm": 0.512396694214876,
"acc_norm_stderr": 0.04562951548180765
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5,
"acc_stderr": 0.04833682445228318,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04833682445228318
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4110429447852761,
"acc_stderr": 0.038656978537853624,
"acc_norm": 0.4110429447852761,
"acc_norm_stderr": 0.038656978537853624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952688,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952688
},
"harness|hendrycksTest-management|5": {
"acc": 0.5922330097087378,
"acc_stderr": 0.048657775704107696,
"acc_norm": 0.5922330097087378,
"acc_norm_stderr": 0.048657775704107696
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6709401709401709,
"acc_stderr": 0.030782321577688173,
"acc_norm": 0.6709401709401709,
"acc_norm_stderr": 0.030782321577688173
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5734355044699873,
"acc_stderr": 0.017686066975675662,
"acc_norm": 0.5734355044699873,
"acc_norm_stderr": 0.017686066975675662
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.45375722543352603,
"acc_stderr": 0.02680372058320619,
"acc_norm": 0.45375722543352603,
"acc_norm_stderr": 0.02680372058320619
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4738562091503268,
"acc_stderr": 0.028590752958852394,
"acc_norm": 0.4738562091503268,
"acc_norm_stderr": 0.028590752958852394
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.48231511254019294,
"acc_stderr": 0.02838032284907713,
"acc_norm": 0.48231511254019294,
"acc_norm_stderr": 0.02838032284907713
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4567901234567901,
"acc_stderr": 0.027716661650194048,
"acc_norm": 0.4567901234567901,
"acc_norm_stderr": 0.027716661650194048
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35815602836879434,
"acc_stderr": 0.028602085862759422,
"acc_norm": 0.35815602836879434,
"acc_norm_stderr": 0.028602085862759422
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.32985658409387225,
"acc_stderr": 0.012008129938540469,
"acc_norm": 0.32985658409387225,
"acc_norm_stderr": 0.012008129938540469
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.49264705882352944,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.49264705882352944,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.01975172650876263,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.01975172650876263
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380062,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380062
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.47346938775510206,
"acc_stderr": 0.03196412734523272,
"acc_norm": 0.47346938775510206,
"acc_norm_stderr": 0.03196412734523272
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03333333333333335,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03333333333333335
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.037891344246115496,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.037891344246115496
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6432748538011696,
"acc_stderr": 0.03674013002860954,
"acc_norm": 0.6432748538011696,
"acc_norm_stderr": 0.03674013002860954
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361002,
"mc2": 0.447415487052591,
"mc2_stderr": 0.015017114299541445
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-high_school_geography-neg-prepend-fix | 2023-08-21T07:35:42.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 5845
num_examples: 5
- name: test
num_bytes: 477448
num_examples: 198
download_size: 12727
dataset_size: 483293
---
# Dataset Card for "mmlu-high_school_geography-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-high_school_government_and_politics-neg-prepend-fix | 2023-08-21T07:35:54.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 6650
num_examples: 5
- name: test
num_bytes: 592819
num_examples: 193
download_size: 13885
dataset_size: 599469
---
# Dataset Card for "mmlu-high_school_government_and_politics-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_heegyu__LIMA2-13b-hf | 2023-08-27T12:39:36.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of heegyu/LIMA2-13b-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [heegyu/LIMA2-13b-hf](https://huggingface.co/heegyu/LIMA2-13b-hf) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_heegyu__LIMA2-13b-hf\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-09T15:19:08.555277](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__LIMA2-13b-hf/blob/main/results_2023-08-09T15%3A19%3A08.555277.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5340108909986633,\n\
\ \"acc_stderr\": 0.03482184932603818,\n \"acc_norm\": 0.5380333729843425,\n\
\ \"acc_norm_stderr\": 0.03479976629431792,\n \"mc1\": 0.28518971848225216,\n\
\ \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.4180556700707162,\n\
\ \"mc2_stderr\": 0.015019305822479715\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5494880546075085,\n \"acc_stderr\": 0.014539646098471627,\n\
\ \"acc_norm\": 0.6023890784982935,\n \"acc_norm_stderr\": 0.014301752223279542\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.652459669388568,\n\
\ \"acc_stderr\": 0.004752158936871873,\n \"acc_norm\": 0.836885082652858,\n\
\ \"acc_norm_stderr\": 0.003687153940568797\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.04026097083296564,\n\
\ \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.04026097083296564\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5622641509433962,\n \"acc_stderr\": 0.030533338430467512,\n\
\ \"acc_norm\": 0.5622641509433962,\n \"acc_norm_stderr\": 0.030533338430467512\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.49710982658959535,\n\
\ \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.49710982658959535,\n\
\ \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.03208115750788684,\n\
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.03208115750788684\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.04372748290278007,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.04372748290278007\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30687830687830686,\n \"acc_stderr\": 0.02375292871211214,\n \"\
acc_norm\": 0.30687830687830686,\n \"acc_norm_stderr\": 0.02375292871211214\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6193548387096774,\n \"acc_stderr\": 0.02762171783290703,\n \"\
acc_norm\": 0.6193548387096774,\n \"acc_norm_stderr\": 0.02762171783290703\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4039408866995074,\n \"acc_stderr\": 0.0345245390382204,\n \"acc_norm\"\
: 0.4039408866995074,\n \"acc_norm_stderr\": 0.0345245390382204\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7070707070707071,\n \"acc_stderr\": 0.032424979581788166,\n \"\
acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.032424979581788166\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7668393782383419,\n \"acc_stderr\": 0.03051611137147602,\n\
\ \"acc_norm\": 0.7668393782383419,\n \"acc_norm_stderr\": 0.03051611137147602\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.49743589743589745,\n \"acc_stderr\": 0.025350672979412195,\n\
\ \"acc_norm\": 0.49743589743589745,\n \"acc_norm_stderr\": 0.025350672979412195\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5378151260504201,\n \"acc_stderr\": 0.032385469487589795,\n\
\ \"acc_norm\": 0.5378151260504201,\n \"acc_norm_stderr\": 0.032385469487589795\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.710091743119266,\n \"acc_stderr\": 0.019453066609201597,\n \"\
acc_norm\": 0.710091743119266,\n \"acc_norm_stderr\": 0.019453066609201597\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7303921568627451,\n\
\ \"acc_stderr\": 0.031145570659486782,\n \"acc_norm\": 0.7303921568627451,\n\
\ \"acc_norm_stderr\": 0.031145570659486782\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.70042194092827,\n \"acc_stderr\": 0.02981802474975309,\n\
\ \"acc_norm\": 0.70042194092827,\n \"acc_norm_stderr\": 0.02981802474975309\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6098654708520179,\n\
\ \"acc_stderr\": 0.032737667254591575,\n \"acc_norm\": 0.6098654708520179,\n\
\ \"acc_norm_stderr\": 0.032737667254591575\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870254,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870254\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6694214876033058,\n \"acc_stderr\": 0.042943408452120926,\n \"\
acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.042943408452120926\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.0471282125742677,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.0471282125742677\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.03746668325470022,\n\
\ \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.03746668325470022\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n\
\ \"acc_stderr\": 0.02685345037700916,\n \"acc_norm\": 0.7863247863247863,\n\
\ \"acc_norm_stderr\": 0.02685345037700916\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7215836526181354,\n\
\ \"acc_stderr\": 0.016028295188992476,\n \"acc_norm\": 0.7215836526181354,\n\
\ \"acc_norm_stderr\": 0.016028295188992476\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.026483392042098174,\n\
\ \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.026483392042098174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36089385474860336,\n\
\ \"acc_stderr\": 0.016062290671110466,\n \"acc_norm\": 0.36089385474860336,\n\
\ \"acc_norm_stderr\": 0.016062290671110466\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.02818059632825929,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.02818059632825929\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6109324758842444,\n\
\ \"acc_stderr\": 0.027690337536485372,\n \"acc_norm\": 0.6109324758842444,\n\
\ \"acc_norm_stderr\": 0.027690337536485372\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5895061728395061,\n \"acc_stderr\": 0.027371350925124768,\n\
\ \"acc_norm\": 0.5895061728395061,\n \"acc_norm_stderr\": 0.027371350925124768\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.37943262411347517,\n \"acc_stderr\": 0.0289473388516141,\n \
\ \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.0289473388516141\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40808344198174706,\n\
\ \"acc_stderr\": 0.012552598958563664,\n \"acc_norm\": 0.40808344198174706,\n\
\ \"acc_norm_stderr\": 0.012552598958563664\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03032024326500413,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03032024326500413\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5212418300653595,\n \"acc_stderr\": 0.020209572388600234,\n \
\ \"acc_norm\": 0.5212418300653595,\n \"acc_norm_stderr\": 0.020209572388600234\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5877551020408164,\n \"acc_stderr\": 0.03151236044674268,\n\
\ \"acc_norm\": 0.5877551020408164,\n \"acc_norm_stderr\": 0.03151236044674268\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6517412935323383,\n\
\ \"acc_stderr\": 0.033687874661154596,\n \"acc_norm\": 0.6517412935323383,\n\
\ \"acc_norm_stderr\": 0.033687874661154596\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28518971848225216,\n\
\ \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.4180556700707162,\n\
\ \"mc2_stderr\": 0.015019305822479715\n }\n}\n```"
repo_url: https://huggingface.co/heegyu/LIMA2-13b-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|arc:challenge|25_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hellaswag|10_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:19:08.555277.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:19:08.555277.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T15:19:08.555277.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T15:19:08.555277.parquet'
- config_name: results
data_files:
- split: 2023_08_09T15_19_08.555277
path:
- results_2023-08-09T15:19:08.555277.parquet
- split: latest
path:
- results_2023-08-09T15:19:08.555277.parquet
---
# Dataset Card for Evaluation run of heegyu/LIMA2-13b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/heegyu/LIMA2-13b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [heegyu/LIMA2-13b-hf](https://huggingface.co/heegyu/LIMA2-13b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_heegyu__LIMA2-13b-hf",
"harness_truthfulqa_mc_0",
split="train")
```
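The per-run split names used in the configurations above are simply the run timestamp with the characters that are invalid in split names replaced. A minimal sketch of that mapping (the helper name is ours, not part of the `datasets` API):

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to the split name used in the configs above."""
    # Split names keep the "T" separator and the fractional seconds, but
    # replace "-" and ":" with "_".
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-08-09T15:19:08.555277"))
# → 2023_08_09T15_19_08.555277
```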
## Latest results
These are the [latest results from run 2023-08-09T15:19:08.555277](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__LIMA2-13b-hf/blob/main/results_2023-08-09T15%3A19%3A08.555277.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5340108909986633,
"acc_stderr": 0.03482184932603818,
"acc_norm": 0.5380333729843425,
"acc_norm_stderr": 0.03479976629431792,
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454892,
"mc2": 0.4180556700707162,
"mc2_stderr": 0.015019305822479715
},
"harness|arc:challenge|25": {
"acc": 0.5494880546075085,
"acc_stderr": 0.014539646098471627,
"acc_norm": 0.6023890784982935,
"acc_norm_stderr": 0.014301752223279542
},
"harness|hellaswag|10": {
"acc": 0.652459669388568,
"acc_stderr": 0.004752158936871873,
"acc_norm": 0.836885082652858,
"acc_norm_stderr": 0.003687153940568797
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5723684210526315,
"acc_stderr": 0.04026097083296564,
"acc_norm": 0.5723684210526315,
"acc_norm_stderr": 0.04026097083296564
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5622641509433962,
"acc_stderr": 0.030533338430467512,
"acc_norm": 0.5622641509433962,
"acc_norm_stderr": 0.030533338430467512
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5625,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.49710982658959535,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.49710982658959535,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929776,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929776
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278007,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278007
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.04164188720169377,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.04164188720169377
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30687830687830686,
"acc_stderr": 0.02375292871211214,
"acc_norm": 0.30687830687830686,
"acc_norm_stderr": 0.02375292871211214
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6193548387096774,
"acc_stderr": 0.02762171783290703,
"acc_norm": 0.6193548387096774,
"acc_norm_stderr": 0.02762171783290703
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4039408866995074,
"acc_stderr": 0.0345245390382204,
"acc_norm": 0.4039408866995074,
"acc_norm_stderr": 0.0345245390382204
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7070707070707071,
"acc_stderr": 0.032424979581788166,
"acc_norm": 0.7070707070707071,
"acc_norm_stderr": 0.032424979581788166
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7668393782383419,
"acc_stderr": 0.03051611137147602,
"acc_norm": 0.7668393782383419,
"acc_norm_stderr": 0.03051611137147602
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.49743589743589745,
"acc_stderr": 0.025350672979412195,
"acc_norm": 0.49743589743589745,
"acc_norm_stderr": 0.025350672979412195
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5378151260504201,
"acc_stderr": 0.032385469487589795,
"acc_norm": 0.5378151260504201,
"acc_norm_stderr": 0.032385469487589795
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.710091743119266,
"acc_stderr": 0.019453066609201597,
"acc_norm": 0.710091743119266,
"acc_norm_stderr": 0.019453066609201597
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7303921568627451,
"acc_stderr": 0.031145570659486782,
"acc_norm": 0.7303921568627451,
"acc_norm_stderr": 0.031145570659486782
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.70042194092827,
"acc_stderr": 0.02981802474975309,
"acc_norm": 0.70042194092827,
"acc_norm_stderr": 0.02981802474975309
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6098654708520179,
"acc_stderr": 0.032737667254591575,
"acc_norm": 0.6098654708520179,
"acc_norm_stderr": 0.032737667254591575
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870254,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870254
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6694214876033058,
"acc_stderr": 0.042943408452120926,
"acc_norm": 0.6694214876033058,
"acc_norm_stderr": 0.042943408452120926
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.0471282125742677,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.0471282125742677
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6503067484662577,
"acc_stderr": 0.03746668325470022,
"acc_norm": 0.6503067484662577,
"acc_norm_stderr": 0.03746668325470022
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.02685345037700916,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.02685345037700916
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7215836526181354,
"acc_stderr": 0.016028295188992476,
"acc_norm": 0.7215836526181354,
"acc_norm_stderr": 0.016028295188992476
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.026483392042098174,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.026483392042098174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36089385474860336,
"acc_stderr": 0.016062290671110466,
"acc_norm": 0.36089385474860336,
"acc_norm_stderr": 0.016062290671110466
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.02818059632825929,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.02818059632825929
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6109324758842444,
"acc_stderr": 0.027690337536485372,
"acc_norm": 0.6109324758842444,
"acc_norm_stderr": 0.027690337536485372
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5895061728395061,
"acc_stderr": 0.027371350925124768,
"acc_norm": 0.5895061728395061,
"acc_norm_stderr": 0.027371350925124768
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.37943262411347517,
"acc_stderr": 0.0289473388516141,
"acc_norm": 0.37943262411347517,
"acc_norm_stderr": 0.0289473388516141
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.40808344198174706,
"acc_stderr": 0.012552598958563664,
"acc_norm": 0.40808344198174706,
"acc_norm_stderr": 0.012552598958563664
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03032024326500413,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03032024326500413
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5212418300653595,
"acc_stderr": 0.020209572388600234,
"acc_norm": 0.5212418300653595,
"acc_norm_stderr": 0.020209572388600234
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5877551020408164,
"acc_stderr": 0.03151236044674268,
"acc_norm": 0.5877551020408164,
"acc_norm_stderr": 0.03151236044674268
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6517412935323383,
"acc_stderr": 0.033687874661154596,
"acc_norm": 0.6517412935323383,
"acc_norm_stderr": 0.033687874661154596
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454892,
"mc2": 0.4180556700707162,
"mc2_stderr": 0.015019305822479715
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_heegyu__RedTulu-Uncensored-3B-0719 | 2023-08-27T12:39:37.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of heegyu/RedTulu-Uncensored-3B-0719
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [heegyu/RedTulu-Uncensored-3B-0719](https://huggingface.co/heegyu/RedTulu-Uncensored-3B-0719)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_heegyu__RedTulu-Uncensored-3B-0719\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-24T10:33:22.624051](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__RedTulu-Uncensored-3B-0719/blob/main/results_2023-07-24T10%3A33%3A22.624051.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.30784805854528013,\n\
\ \"acc_stderr\": 0.03339358689483524,\n \"acc_norm\": 0.31077195277158554,\n\
\ \"acc_norm_stderr\": 0.03339358544463924,\n \"mc1\": 0.22399020807833536,\n\
\ \"mc1_stderr\": 0.014594964329474207,\n \"mc2\": 0.3759292134265843,\n\
\ \"mc2_stderr\": 0.014265211481187241\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3771331058020478,\n \"acc_stderr\": 0.01416336689619259,\n\
\ \"acc_norm\": 0.40017064846416384,\n \"acc_norm_stderr\": 0.014317197787809181\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.476000796654053,\n\
\ \"acc_stderr\": 0.004984030250507289,\n \"acc_norm\": 0.6254730133439554,\n\
\ \"acc_norm_stderr\": 0.004830113797327042\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.037827289808654685,\n\
\ \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.037827289808654685\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27547169811320754,\n \"acc_stderr\": 0.02749566368372406,\n\
\ \"acc_norm\": 0.27547169811320754,\n \"acc_norm_stderr\": 0.02749566368372406\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3472222222222222,\n\
\ \"acc_stderr\": 0.03981240543717861,\n \"acc_norm\": 0.3472222222222222,\n\
\ \"acc_norm_stderr\": 0.03981240543717861\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.27167630057803466,\n\
\ \"acc_stderr\": 0.03391750322321659,\n \"acc_norm\": 0.27167630057803466,\n\
\ \"acc_norm_stderr\": 0.03391750322321659\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364396,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364396\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2127659574468085,\n \"acc_stderr\": 0.026754391348039766,\n\
\ \"acc_norm\": 0.2127659574468085,\n \"acc_norm_stderr\": 0.026754391348039766\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.33793103448275863,\n \"acc_stderr\": 0.039417076320648906,\n\
\ \"acc_norm\": 0.33793103448275863,\n \"acc_norm_stderr\": 0.039417076320648906\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2751322751322751,\n \"acc_stderr\": 0.023000086859068642,\n \"\
acc_norm\": 0.2751322751322751,\n \"acc_norm_stderr\": 0.023000086859068642\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.3419354838709677,\n \"acc_stderr\": 0.02698528957655274,\n \"\
acc_norm\": 0.3419354838709677,\n \"acc_norm_stderr\": 0.02698528957655274\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.28078817733990147,\n \"acc_stderr\": 0.03161856335358609,\n \"\
acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.03161856335358609\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3090909090909091,\n \"acc_stderr\": 0.03608541011573967,\n\
\ \"acc_norm\": 0.3090909090909091,\n \"acc_norm_stderr\": 0.03608541011573967\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.37373737373737376,\n \"acc_stderr\": 0.03446897738659333,\n \"\
acc_norm\": 0.37373737373737376,\n \"acc_norm_stderr\": 0.03446897738659333\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3316062176165803,\n \"acc_stderr\": 0.03397636541089116,\n\
\ \"acc_norm\": 0.3316062176165803,\n \"acc_norm_stderr\": 0.03397636541089116\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3717948717948718,\n \"acc_stderr\": 0.024503472557110943,\n\
\ \"acc_norm\": 0.3717948717948718,\n \"acc_norm_stderr\": 0.024503472557110943\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3445378151260504,\n \"acc_stderr\": 0.030868682604121622,\n\
\ \"acc_norm\": 0.3445378151260504,\n \"acc_norm_stderr\": 0.030868682604121622\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3688073394495413,\n \"acc_stderr\": 0.020686227560729548,\n \"\
acc_norm\": 0.3688073394495413,\n \"acc_norm_stderr\": 0.020686227560729548\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.37962962962962965,\n \"acc_stderr\": 0.03309682581119035,\n \"\
acc_norm\": 0.37962962962962965,\n \"acc_norm_stderr\": 0.03309682581119035\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.27450980392156865,\n \"acc_stderr\": 0.03132179803083291,\n \"\
acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.03132179803083291\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.25316455696202533,\n \"acc_stderr\": 0.0283046579430353,\n \
\ \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.0283046579430353\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.13452914798206278,\n\
\ \"acc_stderr\": 0.022901183761575582,\n \"acc_norm\": 0.13452914798206278,\n\
\ \"acc_norm_stderr\": 0.022901183761575582\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.33587786259541985,\n \"acc_stderr\": 0.041423137719966634,\n\
\ \"acc_norm\": 0.33587786259541985,\n \"acc_norm_stderr\": 0.041423137719966634\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3884297520661157,\n \"acc_stderr\": 0.044492703500683815,\n \"\
acc_norm\": 0.3884297520661157,\n \"acc_norm_stderr\": 0.044492703500683815\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3883495145631068,\n \"acc_stderr\": 0.0482572933735639,\n\
\ \"acc_norm\": 0.3883495145631068,\n \"acc_norm_stderr\": 0.0482572933735639\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.4017094017094017,\n\
\ \"acc_stderr\": 0.03211693751051621,\n \"acc_norm\": 0.4017094017094017,\n\
\ \"acc_norm_stderr\": 0.03211693751051621\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.30140485312899107,\n\
\ \"acc_stderr\": 0.016409091097268787,\n \"acc_norm\": 0.30140485312899107,\n\
\ \"acc_norm_stderr\": 0.016409091097268787\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2658959537572254,\n \"acc_stderr\": 0.023786203255508277,\n\
\ \"acc_norm\": 0.2658959537572254,\n \"acc_norm_stderr\": 0.023786203255508277\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3202614379084967,\n \"acc_stderr\": 0.02671611838015685,\n\
\ \"acc_norm\": 0.3202614379084967,\n \"acc_norm_stderr\": 0.02671611838015685\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2958199356913183,\n\
\ \"acc_stderr\": 0.025922371788818788,\n \"acc_norm\": 0.2958199356913183,\n\
\ \"acc_norm_stderr\": 0.025922371788818788\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.30864197530864196,\n \"acc_stderr\": 0.025702640260603753,\n\
\ \"acc_norm\": 0.30864197530864196,\n \"acc_norm_stderr\": 0.025702640260603753\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2801418439716312,\n \"acc_stderr\": 0.026789172351140242,\n \
\ \"acc_norm\": 0.2801418439716312,\n \"acc_norm_stderr\": 0.026789172351140242\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26597131681877445,\n\
\ \"acc_stderr\": 0.011285033165551281,\n \"acc_norm\": 0.26597131681877445,\n\
\ \"acc_norm_stderr\": 0.011285033165551281\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4338235294117647,\n \"acc_stderr\": 0.030105636570016643,\n\
\ \"acc_norm\": 0.4338235294117647,\n \"acc_norm_stderr\": 0.030105636570016643\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.32727272727272727,\n \"acc_stderr\": 0.04494290866252088,\n\
\ \"acc_norm\": 0.32727272727272727,\n \"acc_norm_stderr\": 0.04494290866252088\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2979591836734694,\n\
\ \"acc_stderr\": 0.02927956741106567,\n \"acc_norm\": 0.2979591836734694,\n\
\ \"acc_norm_stderr\": 0.02927956741106567\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.31343283582089554,\n \"acc_stderr\": 0.03280188205348643,\n\
\ \"acc_norm\": 0.31343283582089554,\n \"acc_norm_stderr\": 0.03280188205348643\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.2289156626506024,\n \"acc_stderr\": 0.03270745277352477,\n\
\ \"acc_norm\": 0.2289156626506024,\n \"acc_norm_stderr\": 0.03270745277352477\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.03508771929824565,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.03508771929824565\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.22399020807833536,\n \"mc1_stderr\": 0.014594964329474207,\n\
\ \"mc2\": 0.3759292134265843,\n \"mc2_stderr\": 0.014265211481187241\n\
\ }\n}\n```"
repo_url: https://huggingface.co/haonan-li/bactrian-x-llama-13b-merged
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|arc:challenge|25_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hellaswag|10_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T10:33:22.624051.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:33:22.624051.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T10:33:22.624051.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T10:33:22.624051.parquet'
- config_name: results
data_files:
- split: 2023_07_24T10_33_22.624051
path:
- results_2023-07-24T10:33:22.624051.parquet
- split: latest
path:
- results_2023-07-24T10:33:22.624051.parquet
---
# Dataset Card for Evaluation run of heegyu/RedTulu-Uncensored-3B-0719
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/heegyu/RedTulu-Uncensored-3B-0719
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [heegyu/RedTulu-Uncensored-3B-0719](https://huggingface.co/heegyu/RedTulu-Uncensored-3B-0719) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_heegyu__RedTulu-Uncensored-3B-0719",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-07-24T10:33:22.624051](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__RedTulu-Uncensored-3B-0719/blob/main/results_2023-07-24T10%3A33%3A22.624051.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.30784805854528013,
"acc_stderr": 0.03339358689483524,
"acc_norm": 0.31077195277158554,
"acc_norm_stderr": 0.03339358544463924,
"mc1": 0.22399020807833536,
"mc1_stderr": 0.014594964329474207,
"mc2": 0.3759292134265843,
"mc2_stderr": 0.014265211481187241
},
"harness|arc:challenge|25": {
"acc": 0.3771331058020478,
"acc_stderr": 0.01416336689619259,
"acc_norm": 0.40017064846416384,
"acc_norm_stderr": 0.014317197787809181
},
"harness|hellaswag|10": {
"acc": 0.476000796654053,
"acc_stderr": 0.004984030250507289,
"acc_norm": 0.6254730133439554,
"acc_norm_stderr": 0.004830113797327042
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.037827289808654685,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.037827289808654685
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27547169811320754,
"acc_stderr": 0.02749566368372406,
"acc_norm": 0.27547169811320754,
"acc_norm_stderr": 0.02749566368372406
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.03981240543717861,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.03981240543717861
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.27167630057803466,
"acc_stderr": 0.03391750322321659,
"acc_norm": 0.27167630057803466,
"acc_norm_stderr": 0.03391750322321659
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364396,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364396
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2127659574468085,
"acc_stderr": 0.026754391348039766,
"acc_norm": 0.2127659574468085,
"acc_norm_stderr": 0.026754391348039766
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.33793103448275863,
"acc_stderr": 0.039417076320648906,
"acc_norm": 0.33793103448275863,
"acc_norm_stderr": 0.039417076320648906
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2751322751322751,
"acc_stderr": 0.023000086859068642,
"acc_norm": 0.2751322751322751,
"acc_norm_stderr": 0.023000086859068642
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3419354838709677,
"acc_stderr": 0.02698528957655274,
"acc_norm": 0.3419354838709677,
"acc_norm_stderr": 0.02698528957655274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.03161856335358609,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.03161856335358609
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.03608541011573967,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.03608541011573967
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.37373737373737376,
"acc_stderr": 0.03446897738659333,
"acc_norm": 0.37373737373737376,
"acc_norm_stderr": 0.03446897738659333
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3316062176165803,
"acc_stderr": 0.03397636541089116,
"acc_norm": 0.3316062176165803,
"acc_norm_stderr": 0.03397636541089116
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3717948717948718,
"acc_stderr": 0.024503472557110943,
"acc_norm": 0.3717948717948718,
"acc_norm_stderr": 0.024503472557110943
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3445378151260504,
"acc_stderr": 0.030868682604121622,
"acc_norm": 0.3445378151260504,
"acc_norm_stderr": 0.030868682604121622
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3688073394495413,
"acc_stderr": 0.020686227560729548,
"acc_norm": 0.3688073394495413,
"acc_norm_stderr": 0.020686227560729548
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.03132179803083291,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.03132179803083291
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.0283046579430353,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.0283046579430353
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.13452914798206278,
"acc_stderr": 0.022901183761575582,
"acc_norm": 0.13452914798206278,
"acc_norm_stderr": 0.022901183761575582
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.33587786259541985,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.33587786259541985,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3884297520661157,
"acc_stderr": 0.044492703500683815,
"acc_norm": 0.3884297520661157,
"acc_norm_stderr": 0.044492703500683815
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.3883495145631068,
"acc_stderr": 0.0482572933735639,
"acc_norm": 0.3883495145631068,
"acc_norm_stderr": 0.0482572933735639
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.4017094017094017,
"acc_stderr": 0.03211693751051621,
"acc_norm": 0.4017094017094017,
"acc_norm_stderr": 0.03211693751051621
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.30140485312899107,
"acc_stderr": 0.016409091097268787,
"acc_norm": 0.30140485312899107,
"acc_norm_stderr": 0.016409091097268787
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.023786203255508277,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.023786203255508277
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3202614379084967,
"acc_stderr": 0.02671611838015685,
"acc_norm": 0.3202614379084967,
"acc_norm_stderr": 0.02671611838015685
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2958199356913183,
"acc_stderr": 0.025922371788818788,
"acc_norm": 0.2958199356913183,
"acc_norm_stderr": 0.025922371788818788
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.30864197530864196,
"acc_stderr": 0.025702640260603753,
"acc_norm": 0.30864197530864196,
"acc_norm_stderr": 0.025702640260603753
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2801418439716312,
"acc_stderr": 0.026789172351140242,
"acc_norm": 0.2801418439716312,
"acc_norm_stderr": 0.026789172351140242
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26597131681877445,
"acc_stderr": 0.011285033165551281,
"acc_norm": 0.26597131681877445,
"acc_norm_stderr": 0.011285033165551281
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4338235294117647,
"acc_stderr": 0.030105636570016643,
"acc_norm": 0.4338235294117647,
"acc_norm_stderr": 0.030105636570016643
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.32727272727272727,
"acc_stderr": 0.04494290866252088,
"acc_norm": 0.32727272727272727,
"acc_norm_stderr": 0.04494290866252088
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2979591836734694,
"acc_stderr": 0.02927956741106567,
"acc_norm": 0.2979591836734694,
"acc_norm_stderr": 0.02927956741106567
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.31343283582089554,
"acc_stderr": 0.03280188205348643,
"acc_norm": 0.31343283582089554,
"acc_norm_stderr": 0.03280188205348643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2289156626506024,
"acc_stderr": 0.03270745277352477,
"acc_norm": 0.2289156626506024,
"acc_norm_stderr": 0.03270745277352477
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.03508771929824565,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.03508771929824565
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22399020807833536,
"mc1_stderr": 0.014594964329474207,
"mc2": 0.3759292134265843,
"mc2_stderr": 0.014265211481187241
}
}
```
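For reference, a per-task results dict like the one above is plain JSON and can be aggregated with a few lines of Python. The sketch below uses a small hypothetical subset of the scores shown here (the key names follow the `harness|hendrycksTest-<subject>|5` pattern from the results) to average the MMLU accuracies:

```python
# Sketch: averaging MMLU (hendrycksTest) accuracies from a results dict
# shaped like the JSON above. The dict below is a small illustrative subset,
# not the full set of task scores.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.28},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.25925925925925924},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.3157894736842105},
}

# Keep only the MMLU tasks (keys prefixed with "harness|hendrycksTest-")
# and take the unweighted mean of their "acc" values.
mmlu_accs = [
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_avg, 4))
```

The same pattern works on the full dict loaded from the `results` configuration of this dataset.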
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-high_school_macroeconomics-neg-prepend-fix | 2023-08-21T07:36:06.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 5403
num_examples: 5
- name: test
num_bytes: 994039
num_examples: 390
download_size: 12073
dataset_size: 999442
---
# Dataset Card for "mmlu-high_school_macroeconomics-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_heegyu__WizardVicuna-3B-0719 | 2023-08-27T12:39:39.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of heegyu/WizardVicuna-3B-0719
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [heegyu/WizardVicuna-3B-0719](https://huggingface.co/heegyu/WizardVicuna-3B-0719)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_heegyu__WizardVicuna-3B-0719\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-24T10:31:33.839492](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__WizardVicuna-3B-0719/blob/main/results_2023-07-24T10%3A31%3A33.839492.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26086368697383394,\n\
\ \"acc_stderr\": 0.031716589495169124,\n \"acc_norm\": 0.2637909568529062,\n\
\ \"acc_norm_stderr\": 0.03171489713211313,\n \"mc1\": 0.2729498164014688,\n\
\ \"mc1_stderr\": 0.01559475363200653,\n \"mc2\": 0.40705647715032217,\n\
\ \"mc2_stderr\": 0.014454038951136792\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3839590443686007,\n \"acc_stderr\": 0.01421244498065189,\n\
\ \"acc_norm\": 0.4069965870307167,\n \"acc_norm_stderr\": 0.014356399418009121\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.504779924317865,\n\
\ \"acc_stderr\": 0.004989553396413096,\n \"acc_norm\": 0.6544513045210117,\n\
\ \"acc_norm_stderr\": 0.004745749538752329\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n\
\ \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n\
\ \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2830188679245283,\n \"acc_stderr\": 0.027724236492700904,\n\
\ \"acc_norm\": 0.2830188679245283,\n \"acc_norm_stderr\": 0.027724236492700904\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2847222222222222,\n\
\ \"acc_stderr\": 0.03773809990686934,\n \"acc_norm\": 0.2847222222222222,\n\
\ \"acc_norm_stderr\": 0.03773809990686934\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.30057803468208094,\n\
\ \"acc_stderr\": 0.03496101481191181,\n \"acc_norm\": 0.30057803468208094,\n\
\ \"acc_norm_stderr\": 0.03496101481191181\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.030683020843231004,\n\
\ \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.030683020843231004\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.19298245614035087,\n\
\ \"acc_stderr\": 0.037124548537213684,\n \"acc_norm\": 0.19298245614035087,\n\
\ \"acc_norm_stderr\": 0.037124548537213684\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.03724563619774632,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.03724563619774632\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.22486772486772486,\n \"acc_stderr\": 0.021502096078229147,\n \"\
acc_norm\": 0.22486772486772486,\n \"acc_norm_stderr\": 0.021502096078229147\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.038522733649243183,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.038522733649243183\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23225806451612904,\n\
\ \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.23225806451612904,\n\
\ \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694433,\n\
\ \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694433\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.033175059300091805,\n\
\ \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.033175059300091805\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.15656565656565657,\n \"acc_stderr\": 0.025890520358141454,\n \"\
acc_norm\": 0.15656565656565657,\n \"acc_norm_stderr\": 0.025890520358141454\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.24870466321243523,\n \"acc_stderr\": 0.031195840877700304,\n\
\ \"acc_norm\": 0.24870466321243523,\n \"acc_norm_stderr\": 0.031195840877700304\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.23333333333333334,\n \"acc_stderr\": 0.021444547301560486,\n\
\ \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.021444547301560486\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844082,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844082\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.25630252100840334,\n \"acc_stderr\": 0.02835962087053395,\n\
\ \"acc_norm\": 0.25630252100840334,\n \"acc_norm_stderr\": 0.02835962087053395\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.036030385453603826,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.036030385453603826\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21284403669724772,\n \"acc_stderr\": 0.017549376389313694,\n \"\
acc_norm\": 0.21284403669724772,\n \"acc_norm_stderr\": 0.017549376389313694\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.24537037037037038,\n \"acc_stderr\": 0.02934666509437295,\n \"\
acc_norm\": 0.24537037037037038,\n \"acc_norm_stderr\": 0.02934666509437295\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.21568627450980393,\n \"acc_stderr\": 0.028867431449849313,\n \"\
acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.028867431449849313\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3273542600896861,\n\
\ \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.3273542600896861,\n\
\ \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.20610687022900764,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.20610687022900764,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.23140495867768596,\n \"acc_stderr\": 0.03849856098794089,\n \"\
acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.03849856098794089\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.28703703703703703,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.03226219377286774,\n\
\ \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.03226219377286774\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340455,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340455\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.28205128205128205,\n\
\ \"acc_stderr\": 0.029480360549541194,\n \"acc_norm\": 0.28205128205128205,\n\
\ \"acc_norm_stderr\": 0.029480360549541194\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2835249042145594,\n\
\ \"acc_stderr\": 0.01611731816683228,\n \"acc_norm\": 0.2835249042145594,\n\
\ \"acc_norm_stderr\": 0.01611731816683228\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.30057803468208094,\n \"acc_stderr\": 0.02468531686725781,\n\
\ \"acc_norm\": 0.30057803468208094,\n \"acc_norm_stderr\": 0.02468531686725781\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n\
\ \"acc_stderr\": 0.014444157808261427,\n \"acc_norm\": 0.24804469273743016,\n\
\ \"acc_norm_stderr\": 0.014444157808261427\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.02428861946604611,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.02428861946604611\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2347266881028939,\n\
\ \"acc_stderr\": 0.024071805887677048,\n \"acc_norm\": 0.2347266881028939,\n\
\ \"acc_norm_stderr\": 0.024071805887677048\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24691358024691357,\n \"acc_stderr\": 0.023993501709042114,\n\
\ \"acc_norm\": 0.24691358024691357,\n \"acc_norm_stderr\": 0.023993501709042114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25177304964539005,\n \"acc_stderr\": 0.025892151156709405,\n \
\ \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.025892151156709405\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24185136897001303,\n\
\ \"acc_stderr\": 0.010936550813827061,\n \"acc_norm\": 0.24185136897001303,\n\
\ \"acc_norm_stderr\": 0.010936550813827061\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.35661764705882354,\n \"acc_stderr\": 0.029097209568411952,\n\
\ \"acc_norm\": 0.35661764705882354,\n \"acc_norm_stderr\": 0.029097209568411952\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.04350271442923243,\n\
\ \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.04350271442923243\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.19183673469387755,\n\
\ \"acc_stderr\": 0.025206963154225392,\n \"acc_norm\": 0.19183673469387755,\n\
\ \"acc_norm_stderr\": 0.025206963154225392\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.2935323383084577,\n \"acc_stderr\": 0.03220024104534205,\n\
\ \"acc_norm\": 0.2935323383084577,\n \"acc_norm_stderr\": 0.03220024104534205\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.2710843373493976,\n \"acc_stderr\": 0.03460579907553027,\n\
\ \"acc_norm\": 0.2710843373493976,\n \"acc_norm_stderr\": 0.03460579907553027\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.32748538011695905,\n\
\ \"acc_stderr\": 0.035993357714560276,\n \"acc_norm\": 0.32748538011695905,\n\
\ \"acc_norm_stderr\": 0.035993357714560276\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.2729498164014688,\n \"mc1_stderr\": 0.01559475363200653,\n\
\ \"mc2\": 0.40705647715032217,\n \"mc2_stderr\": 0.014454038951136792\n\
\ }\n}\n```"
repo_url: https://huggingface.co/haonan-li/bactrian-x-llama-13b-merged
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|arc:challenge|25_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hellaswag|10_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T10:31:33.839492.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:31:33.839492.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T10:31:33.839492.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T10:31:33.839492.parquet'
- config_name: results
data_files:
- split: 2023_07_24T10_31_33.839492
path:
- results_2023-07-24T10:31:33.839492.parquet
- split: latest
path:
- results_2023-07-24T10:31:33.839492.parquet
---
# Dataset Card for Evaluation run of heegyu/WizardVicuna-3B-0719
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/heegyu/WizardVicuna-3B-0719
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [heegyu/WizardVicuna-3B-0719](https://huggingface.co/heegyu/WizardVicuna-3B-0719) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_heegyu__WizardVicuna-3B-0719",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-07-24T10:31:33.839492](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__WizardVicuna-3B-0719/blob/main/results_2023-07-24T10%3A31%3A33.839492.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26086368697383394,
"acc_stderr": 0.031716589495169124,
"acc_norm": 0.2637909568529062,
"acc_norm_stderr": 0.03171489713211313,
"mc1": 0.2729498164014688,
"mc1_stderr": 0.01559475363200653,
"mc2": 0.40705647715032217,
"mc2_stderr": 0.014454038951136792
},
"harness|arc:challenge|25": {
"acc": 0.3839590443686007,
"acc_stderr": 0.01421244498065189,
"acc_norm": 0.4069965870307167,
"acc_norm_stderr": 0.014356399418009121
},
"harness|hellaswag|10": {
"acc": 0.504779924317865,
"acc_stderr": 0.004989553396413096,
"acc_norm": 0.6544513045210117,
"acc_norm_stderr": 0.004745749538752329
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2830188679245283,
"acc_stderr": 0.027724236492700904,
"acc_norm": 0.2830188679245283,
"acc_norm_stderr": 0.027724236492700904
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2847222222222222,
"acc_stderr": 0.03773809990686934,
"acc_norm": 0.2847222222222222,
"acc_norm_stderr": 0.03773809990686934
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.30057803468208094,
"acc_stderr": 0.03496101481191181,
"acc_norm": 0.30057803468208094,
"acc_norm_stderr": 0.03496101481191181
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.030683020843231004,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.030683020843231004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.19298245614035087,
"acc_stderr": 0.037124548537213684,
"acc_norm": 0.19298245614035087,
"acc_norm_stderr": 0.037124548537213684
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.03724563619774632,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.03724563619774632
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.22486772486772486,
"acc_stderr": 0.021502096078229147,
"acc_norm": 0.22486772486772486,
"acc_norm_stderr": 0.021502096078229147
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.038522733649243183,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.038522733649243183
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23225806451612904,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.23225806451612904,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694433,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694433
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.033175059300091805,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.033175059300091805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.15656565656565657,
"acc_stderr": 0.025890520358141454,
"acc_norm": 0.15656565656565657,
"acc_norm_stderr": 0.025890520358141454
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24870466321243523,
"acc_stderr": 0.031195840877700304,
"acc_norm": 0.24870466321243523,
"acc_norm_stderr": 0.031195840877700304
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.021444547301560486,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.021444547301560486
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844082,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844082
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.25630252100840334,
"acc_stderr": 0.02835962087053395,
"acc_norm": 0.25630252100840334,
"acc_norm_stderr": 0.02835962087053395
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.036030385453603826,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.036030385453603826
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21284403669724772,
"acc_stderr": 0.017549376389313694,
"acc_norm": 0.21284403669724772,
"acc_norm_stderr": 0.017549376389313694
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.24537037037037038,
"acc_stderr": 0.02934666509437295,
"acc_norm": 0.24537037037037038,
"acc_norm_stderr": 0.02934666509437295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3273542600896861,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.3273542600896861,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.20610687022900764,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.20610687022900764,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.03849856098794089,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.03849856098794089
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2147239263803681,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.2147239263803681,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340455,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340455
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.28205128205128205,
"acc_stderr": 0.029480360549541194,
"acc_norm": 0.28205128205128205,
"acc_norm_stderr": 0.029480360549541194
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2835249042145594,
"acc_stderr": 0.01611731816683228,
"acc_norm": 0.2835249042145594,
"acc_norm_stderr": 0.01611731816683228
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.30057803468208094,
"acc_stderr": 0.02468531686725781,
"acc_norm": 0.30057803468208094,
"acc_norm_stderr": 0.02468531686725781
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24804469273743016,
"acc_stderr": 0.014444157808261427,
"acc_norm": 0.24804469273743016,
"acc_norm_stderr": 0.014444157808261427
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.02428861946604611,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.02428861946604611
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2347266881028939,
"acc_stderr": 0.024071805887677048,
"acc_norm": 0.2347266881028939,
"acc_norm_stderr": 0.024071805887677048
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24691358024691357,
"acc_stderr": 0.023993501709042114,
"acc_norm": 0.24691358024691357,
"acc_norm_stderr": 0.023993501709042114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.025892151156709405,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.025892151156709405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24185136897001303,
"acc_stderr": 0.010936550813827061,
"acc_norm": 0.24185136897001303,
"acc_norm_stderr": 0.010936550813827061
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.35661764705882354,
"acc_stderr": 0.029097209568411952,
"acc_norm": 0.35661764705882354,
"acc_norm_stderr": 0.029097209568411952
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19183673469387755,
"acc_stderr": 0.025206963154225392,
"acc_norm": 0.19183673469387755,
"acc_norm_stderr": 0.025206963154225392
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2935323383084577,
"acc_stderr": 0.03220024104534205,
"acc_norm": 0.2935323383084577,
"acc_norm_stderr": 0.03220024104534205
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553027,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553027
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.32748538011695905,
"acc_stderr": 0.035993357714560276,
"acc_norm": 0.32748538011695905,
"acc_norm_stderr": 0.035993357714560276
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2729498164014688,
"mc1_stderr": 0.01559475363200653,
"mc2": 0.40705647715032217,
"mc2_stderr": 0.014454038951136792
}
}
```
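The per-task accuracies above can also be inspected programmatically. As a small self-contained sketch (using a handful of scores copied from the results JSON above, not loading the dataset itself), ranking tasks by accuracy looks like:

```python
# Rank a few of the per-task accuracies copied from the results JSON above
# (subset shown for brevity; the full dict has one entry per task).
results = {
    "harness|hendrycksTest-conceptual_physics|5": 0.3276595744680851,
    "harness|hendrycksTest-high_school_geography|5": 0.15656565656565657,
    "harness|hendrycksTest-world_religions|5": 0.32748538011695905,
    "harness|hendrycksTest-professional_medicine|5": 0.35661764705882354,
}

# Sort tasks from highest to lowest accuracy.
ranked = sorted(results.items(), key=lambda kv: kv[1], reverse=True)
best_task, best_acc = ranked[0]
worst_task, worst_acc = ranked[-1]
print(best_task, round(best_acc, 4))   # professional_medicine is highest here
print(worst_task, round(worst_acc, 4))  # high_school_geography is lowest here
```

The same pattern applies to the full `"all"` dict once loaded from the `results` configuration.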
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_heegyu__WizardVicuna2-13b-hf | 2023-08-27T12:39:41.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of heegyu/WizardVicuna2-13b-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [heegyu/WizardVicuna2-13b-hf](https://huggingface.co/heegyu/WizardVicuna2-13b-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_heegyu__WizardVicuna2-13b-hf\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-09T15:23:39.656390](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__WizardVicuna2-13b-hf/blob/main/results_2023-08-09T15%3A23%3A39.656390.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.48710928697948025,\n\
\ \"acc_stderr\": 0.03511620404303202,\n \"acc_norm\": 0.49092431609298764,\n\
\ \"acc_norm_stderr\": 0.035100777843621186,\n \"mc1\": 0.28518971848225216,\n\
\ \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.4242681935948369,\n\
\ \"mc2_stderr\": 0.015013141678369209\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5247440273037542,\n \"acc_stderr\": 0.01459348769493774,\n\
\ \"acc_norm\": 0.5537542662116041,\n \"acc_norm_stderr\": 0.014526705548539982\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5952997410874328,\n\
\ \"acc_stderr\": 0.00489830816721185,\n \"acc_norm\": 0.7913762198765186,\n\
\ \"acc_norm_stderr\": 0.004054944548370497\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4407894736842105,\n \"acc_stderr\": 0.04040311062490436,\n\
\ \"acc_norm\": 0.4407894736842105,\n \"acc_norm_stderr\": 0.04040311062490436\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4679245283018868,\n \"acc_stderr\": 0.030709486992556538,\n\
\ \"acc_norm\": 0.4679245283018868,\n \"acc_norm_stderr\": 0.030709486992556538\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4791666666666667,\n\
\ \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.4791666666666667,\n\
\ \"acc_norm_stderr\": 0.041775789507399935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.38,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4046242774566474,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.4046242774566474,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30158730158730157,\n \"acc_stderr\": 0.023636975996101806,\n \"\
acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.023636975996101806\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.532258064516129,\n\
\ \"acc_stderr\": 0.028384747788813332,\n \"acc_norm\": 0.532258064516129,\n\
\ \"acc_norm_stderr\": 0.028384747788813332\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3399014778325123,\n \"acc_stderr\": 0.033327690684107895,\n\
\ \"acc_norm\": 0.3399014778325123,\n \"acc_norm_stderr\": 0.033327690684107895\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.03742597043806585,\n\
\ \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.03742597043806585\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5959595959595959,\n \"acc_stderr\": 0.03496130972056129,\n \"\
acc_norm\": 0.5959595959595959,\n \"acc_norm_stderr\": 0.03496130972056129\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.694300518134715,\n \"acc_stderr\": 0.033248379397581594,\n\
\ \"acc_norm\": 0.694300518134715,\n \"acc_norm_stderr\": 0.033248379397581594\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.38974358974358975,\n \"acc_stderr\": 0.024726967886647078,\n\
\ \"acc_norm\": 0.38974358974358975,\n \"acc_norm_stderr\": 0.024726967886647078\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871923,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871923\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5,\n \"acc_stderr\": 0.032478490123081544,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.032478490123081544\n },\n \"harness|hendrycksTest-high_school_physics|5\"\
: {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n\
\ \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6440366972477064,\n \"acc_stderr\": 0.020528559278244214,\n \"\
acc_norm\": 0.6440366972477064,\n \"acc_norm_stderr\": 0.020528559278244214\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3194444444444444,\n \"acc_stderr\": 0.0317987634217685,\n \"acc_norm\"\
: 0.3194444444444444,\n \"acc_norm_stderr\": 0.0317987634217685\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6225490196078431,\n\
\ \"acc_stderr\": 0.03402272044340705,\n \"acc_norm\": 0.6225490196078431,\n\
\ \"acc_norm_stderr\": 0.03402272044340705\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.030685820596610805,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.030685820596610805\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5190839694656488,\n \"acc_stderr\": 0.04382094705550988,\n\
\ \"acc_norm\": 0.5190839694656488,\n \"acc_norm_stderr\": 0.04382094705550988\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6859504132231405,\n \"acc_stderr\": 0.042369647530410184,\n \"\
acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.042369647530410184\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6018518518518519,\n\
\ \"acc_stderr\": 0.04732332615978814,\n \"acc_norm\": 0.6018518518518519,\n\
\ \"acc_norm_stderr\": 0.04732332615978814\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.48466257668711654,\n \"acc_stderr\": 0.039265223787088424,\n\
\ \"acc_norm\": 0.48466257668711654,\n \"acc_norm_stderr\": 0.039265223787088424\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5922330097087378,\n \"acc_stderr\": 0.0486577757041077,\n\
\ \"acc_norm\": 0.5922330097087378,\n \"acc_norm_stderr\": 0.0486577757041077\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7478632478632479,\n\
\ \"acc_stderr\": 0.028447965476231022,\n \"acc_norm\": 0.7478632478632479,\n\
\ \"acc_norm_stderr\": 0.028447965476231022\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6768837803320562,\n\
\ \"acc_stderr\": 0.01672372651234305,\n \"acc_norm\": 0.6768837803320562,\n\
\ \"acc_norm_stderr\": 0.01672372651234305\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5202312138728323,\n \"acc_stderr\": 0.026897049996382875,\n\
\ \"acc_norm\": 0.5202312138728323,\n \"acc_norm_stderr\": 0.026897049996382875\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2905027932960894,\n\
\ \"acc_stderr\": 0.015183844307206146,\n \"acc_norm\": 0.2905027932960894,\n\
\ \"acc_norm_stderr\": 0.015183844307206146\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4934640522875817,\n \"acc_stderr\": 0.028627470550556054,\n\
\ \"acc_norm\": 0.4934640522875817,\n \"acc_norm_stderr\": 0.028627470550556054\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5787781350482315,\n\
\ \"acc_stderr\": 0.02804339985821063,\n \"acc_norm\": 0.5787781350482315,\n\
\ \"acc_norm_stderr\": 0.02804339985821063\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5216049382716049,\n \"acc_stderr\": 0.027794760105008736,\n\
\ \"acc_norm\": 0.5216049382716049,\n \"acc_norm_stderr\": 0.027794760105008736\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3900709219858156,\n \"acc_stderr\": 0.02909767559946393,\n \
\ \"acc_norm\": 0.3900709219858156,\n \"acc_norm_stderr\": 0.02909767559946393\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3774445893089961,\n\
\ \"acc_stderr\": 0.012380680911165814,\n \"acc_norm\": 0.3774445893089961,\n\
\ \"acc_norm_stderr\": 0.012380680911165814\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.34191176470588236,\n \"acc_stderr\": 0.028814722422254184,\n\
\ \"acc_norm\": 0.34191176470588236,\n \"acc_norm_stderr\": 0.028814722422254184\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4950980392156863,\n \"acc_stderr\": 0.020226862710039463,\n \
\ \"acc_norm\": 0.4950980392156863,\n \"acc_norm_stderr\": 0.020226862710039463\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.04653429807913508,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.04653429807913508\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.563265306122449,\n \"acc_stderr\": 0.031751952375833226,\n\
\ \"acc_norm\": 0.563265306122449,\n \"acc_norm_stderr\": 0.031751952375833226\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6467661691542289,\n\
\ \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.6467661691542289,\n\
\ \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7076023391812866,\n \"acc_stderr\": 0.03488647713457923,\n\
\ \"acc_norm\": 0.7076023391812866,\n \"acc_norm_stderr\": 0.03488647713457923\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28518971848225216,\n\
\ \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.4242681935948369,\n\
\ \"mc2_stderr\": 0.015013141678369209\n }\n}\n```"
repo_url: https://huggingface.co/heegyu/WizardVicuna2-13b-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|arc:challenge|25_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hellaswag|10_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T15:23:39.656390.parquet'
- config_name: results
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- results_2023-08-09T15:23:39.656390.parquet
- split: latest
path:
- results_2023-08-09T15:23:39.656390.parquet
---
# Dataset Card for Evaluation run of heegyu/WizardVicuna2-13b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/heegyu/WizardVicuna2-13b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [heegyu/WizardVicuna2-13b-hf](https://huggingface.co/heegyu/WizardVicuna2-13b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_heegyu__WizardVicuna2-13b-hf",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-08-09T15:23:39.656390](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__WizardVicuna2-13b-hf/blob/main/results_2023-08-09T15%3A23%3A39.656390.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.48710928697948025,
"acc_stderr": 0.03511620404303202,
"acc_norm": 0.49092431609298764,
"acc_norm_stderr": 0.035100777843621186,
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454892,
"mc2": 0.4242681935948369,
"mc2_stderr": 0.015013141678369209
},
"harness|arc:challenge|25": {
"acc": 0.5247440273037542,
"acc_stderr": 0.01459348769493774,
"acc_norm": 0.5537542662116041,
"acc_norm_stderr": 0.014526705548539982
},
"harness|hellaswag|10": {
"acc": 0.5952997410874328,
"acc_stderr": 0.00489830816721185,
"acc_norm": 0.7913762198765186,
"acc_norm_stderr": 0.004054944548370497
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4407894736842105,
"acc_stderr": 0.04040311062490436,
"acc_norm": 0.4407894736842105,
"acc_norm_stderr": 0.04040311062490436
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4679245283018868,
"acc_stderr": 0.030709486992556538,
"acc_norm": 0.4679245283018868,
"acc_norm_stderr": 0.030709486992556538
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4791666666666667,
"acc_stderr": 0.041775789507399935,
"acc_norm": 0.4791666666666667,
"acc_norm_stderr": 0.041775789507399935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4046242774566474,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.4046242774566474,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.023636975996101806,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.023636975996101806
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.532258064516129,
"acc_stderr": 0.028384747788813332,
"acc_norm": 0.532258064516129,
"acc_norm_stderr": 0.028384747788813332
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3399014778325123,
"acc_stderr": 0.033327690684107895,
"acc_norm": 0.3399014778325123,
"acc_norm_stderr": 0.033327690684107895
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.03742597043806585,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.03742597043806585
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5959595959595959,
"acc_stderr": 0.03496130972056129,
"acc_norm": 0.5959595959595959,
"acc_norm_stderr": 0.03496130972056129
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.694300518134715,
"acc_stderr": 0.033248379397581594,
"acc_norm": 0.694300518134715,
"acc_norm_stderr": 0.033248379397581594
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.38974358974358975,
"acc_stderr": 0.024726967886647078,
"acc_norm": 0.38974358974358975,
"acc_norm_stderr": 0.024726967886647078
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871923,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871923
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5,
"acc_stderr": 0.032478490123081544,
"acc_norm": 0.5,
"acc_norm_stderr": 0.032478490123081544
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6440366972477064,
"acc_stderr": 0.020528559278244214,
"acc_norm": 0.6440366972477064,
"acc_norm_stderr": 0.020528559278244214
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3194444444444444,
"acc_stderr": 0.0317987634217685,
"acc_norm": 0.3194444444444444,
"acc_norm_stderr": 0.0317987634217685
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6225490196078431,
"acc_stderr": 0.03402272044340705,
"acc_norm": 0.6225490196078431,
"acc_norm_stderr": 0.03402272044340705
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.030685820596610805,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.030685820596610805
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5190839694656488,
"acc_stderr": 0.04382094705550988,
"acc_norm": 0.5190839694656488,
"acc_norm_stderr": 0.04382094705550988
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.042369647530410184,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.042369647530410184
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.04732332615978814,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.04732332615978814
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.48466257668711654,
"acc_stderr": 0.039265223787088424,
"acc_norm": 0.48466257668711654,
"acc_norm_stderr": 0.039265223787088424
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.5922330097087378,
"acc_stderr": 0.0486577757041077,
"acc_norm": 0.5922330097087378,
"acc_norm_stderr": 0.0486577757041077
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7478632478632479,
"acc_stderr": 0.028447965476231022,
"acc_norm": 0.7478632478632479,
"acc_norm_stderr": 0.028447965476231022
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6768837803320562,
"acc_stderr": 0.01672372651234305,
"acc_norm": 0.6768837803320562,
"acc_norm_stderr": 0.01672372651234305
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.026897049996382875,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.026897049996382875
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2905027932960894,
"acc_stderr": 0.015183844307206146,
"acc_norm": 0.2905027932960894,
"acc_norm_stderr": 0.015183844307206146
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4934640522875817,
"acc_stderr": 0.028627470550556054,
"acc_norm": 0.4934640522875817,
"acc_norm_stderr": 0.028627470550556054
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5787781350482315,
"acc_stderr": 0.02804339985821063,
"acc_norm": 0.5787781350482315,
"acc_norm_stderr": 0.02804339985821063
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5216049382716049,
"acc_stderr": 0.027794760105008736,
"acc_norm": 0.5216049382716049,
"acc_norm_stderr": 0.027794760105008736
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3900709219858156,
"acc_stderr": 0.02909767559946393,
"acc_norm": 0.3900709219858156,
"acc_norm_stderr": 0.02909767559946393
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3774445893089961,
"acc_stderr": 0.012380680911165814,
"acc_norm": 0.3774445893089961,
"acc_norm_stderr": 0.012380680911165814
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.34191176470588236,
"acc_stderr": 0.028814722422254184,
"acc_norm": 0.34191176470588236,
"acc_norm_stderr": 0.028814722422254184
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4950980392156863,
"acc_stderr": 0.020226862710039463,
"acc_norm": 0.4950980392156863,
"acc_norm_stderr": 0.020226862710039463
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.04653429807913508,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.04653429807913508
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.563265306122449,
"acc_stderr": 0.031751952375833226,
"acc_norm": 0.563265306122449,
"acc_norm_stderr": 0.031751952375833226
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6467661691542289,
"acc_stderr": 0.03379790611796777,
"acc_norm": 0.6467661691542289,
"acc_norm_stderr": 0.03379790611796777
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7076023391812866,
"acc_stderr": 0.03488647713457923,
"acc_norm": 0.7076023391812866,
"acc_norm_stderr": 0.03488647713457923
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454892,
"mc2": 0.4242681935948369,
"mc2_stderr": 0.015013141678369209
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-high_school_mathematics-neg-prepend-fix | 2023-08-21T07:36:19.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 6798
num_examples: 5
- name: test
num_bytes: 654905
num_examples: 270
download_size: 15082
dataset_size: 661703
---
# Dataset Card for "mmlu-high_school_mathematics-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Writer__palmyra-base | 2023-08-28T20:42:16.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of None
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [None](https://huggingface.co/None) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 119 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest\
\ results.\n\nAn additional configuration \"results\" stores all the aggregated results\
\ of the run (and is used to compute and display the aggregated metrics on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Writer__palmyra-base\"\
,\n\t\"original_mmlu_world_religions_5\",\n\tsplit=\"train\")\n```\n\n## Latest\
\ results\n\nThese are the [latest results from run 2023-08-28T20:42:00.075340](https://huggingface.co/datasets/open-llm-leaderboard/details_Writer__palmyra-base/blob/main/results_2023-08-28T20%3A42%3A00.075340.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.27537968563397885,\n\
\ \"acc_stderr\": 0.033055212496120936\n },\n \"original|mmlu:abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446\n },\n\
\ \"original|mmlu:anatomy|5\": {\n \"acc\": 0.22962962962962963,\n \
\ \"acc_stderr\": 0.03633384414073461\n },\n \"original|mmlu:astronomy|5\"\
: {\n \"acc\": 0.34210526315789475,\n \"acc_stderr\": 0.038607315993160904\n\
\ },\n \"original|mmlu:business_ethics|5\": {\n \"acc\": 0.19,\n \
\ \"acc_stderr\": 0.03942772444036623\n },\n \"original|mmlu:clinical_knowledge|5\"\
: {\n \"acc\": 0.2981132075471698,\n \"acc_stderr\": 0.028152837942493857\n\
\ },\n \"original|mmlu:college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532\n },\n \"original|mmlu:college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236\n },\n\
\ \"original|mmlu:college_computer_science|5\": {\n \"acc\": 0.28,\n \
\ \"acc_stderr\": 0.045126085985421276\n },\n \"original|mmlu:college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316\n },\n\
\ \"original|mmlu:college_medicine|5\": {\n \"acc\": 0.37572254335260113,\n\
\ \"acc_stderr\": 0.03692820767264867\n },\n \"original|mmlu:college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.04336432707993177\n\
\ },\n \"original|mmlu:computer_security|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.04725815626252605\n },\n \"original|mmlu:conceptual_physics|5\"\
: {\n \"acc\": 0.22127659574468084,\n \"acc_stderr\": 0.02713634960242406\n\
\ },\n \"original|mmlu:econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344\n },\n \"original|mmlu:electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.036001056927277716\n\
\ },\n \"original|mmlu:elementary_mathematics|5\": {\n \"acc\": 0.291005291005291,\n\
\ \"acc_stderr\": 0.02339382650048487\n },\n \"original|mmlu:formal_logic|5\"\
: {\n \"acc\": 0.3492063492063492,\n \"acc_stderr\": 0.04263906892795132\n\
\ },\n \"original|mmlu:global_facts|5\": {\n \"acc\": 0.35,\n \
\ \"acc_stderr\": 0.047937248544110196\n },\n \"original|mmlu:high_school_biology|5\"\
: {\n \"acc\": 0.2967741935483871,\n \"acc_stderr\": 0.025988500792411894\n\
\ },\n \"original|mmlu:high_school_chemistry|5\": {\n \"acc\": 0.27586206896551724,\n\
\ \"acc_stderr\": 0.031447125816782426\n },\n \"original|mmlu:high_school_computer_science|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269\n },\n\
\ \"original|mmlu:high_school_european_history|5\": {\n \"acc\": 0.2545454545454545,\n\
\ \"acc_stderr\": 0.03401506715249039\n },\n \"original|mmlu:high_school_geography|5\"\
: {\n \"acc\": 0.35858585858585856,\n \"acc_stderr\": 0.03416903640391521\n\
\ },\n \"original|mmlu:high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.3626943005181347,\n \"acc_stderr\": 0.03469713791704371\n \
\ },\n \"original|mmlu:high_school_macroeconomics|5\": {\n \"acc\":\
\ 0.358974358974359,\n \"acc_stderr\": 0.024321738484602357\n },\n \
\ \"original|mmlu:high_school_mathematics|5\": {\n \"acc\": 0.2740740740740741,\n\
\ \"acc_stderr\": 0.027195934804085626\n },\n \"original|mmlu:high_school_microeconomics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.030388353551886845\n\
\ },\n \"original|mmlu:high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n\
\ \"acc_stderr\": 0.03684881521389023\n },\n \"original|mmlu:high_school_psychology|5\"\
: {\n \"acc\": 0.3467889908256881,\n \"acc_stderr\": 0.020406097104093027\n\
\ },\n \"original|mmlu:high_school_statistics|5\": {\n \"acc\": 0.4583333333333333,\n\
\ \"acc_stderr\": 0.03398110890294636\n },\n \"original|mmlu:high_school_us_history|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154\n },\n\
\ \"original|mmlu:high_school_world_history|5\": {\n \"acc\": 0.2320675105485232,\n\
\ \"acc_stderr\": 0.027479744550808524\n },\n \"original|mmlu:human_aging|5\"\
: {\n \"acc\": 0.11659192825112108,\n \"acc_stderr\": 0.02153963981624447\n\
\ },\n \"original|mmlu:human_sexuality|5\": {\n \"acc\": 0.20610687022900764,\n\
\ \"acc_stderr\": 0.035477710041594626\n },\n \"original|mmlu:international_law|5\"\
: {\n \"acc\": 0.2809917355371901,\n \"acc_stderr\": 0.04103203830514512\n\
\ },\n \"original|mmlu:jurisprudence|5\": {\n \"acc\": 0.19444444444444445,\n\
\ \"acc_stderr\": 0.03826076324884863\n },\n \"original|mmlu:logical_fallacies|5\"\
: {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354\n\
\ },\n \"original|mmlu:machine_learning|5\": {\n \"acc\": 0.16964285714285715,\n\
\ \"acc_stderr\": 0.03562367850095391\n },\n \"original|mmlu:management|5\"\
: {\n \"acc\": 0.34951456310679613,\n \"acc_stderr\": 0.04721188506097173\n\
\ },\n \"original|mmlu:marketing|5\": {\n \"acc\": 0.2264957264957265,\n\
\ \"acc_stderr\": 0.027421007295392926\n },\n \"original|mmlu:medical_genetics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446\n },\n\
\ \"original|mmlu:miscellaneous|5\": {\n \"acc\": 0.2120051085568327,\n\
\ \"acc_stderr\": 0.014616099385833694\n },\n \"original|mmlu:moral_disputes|5\"\
: {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.022497230190967554\n\
\ },\n \"original|mmlu:moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249588\n },\n \"original|mmlu:nutrition|5\"\
: {\n \"acc\": 0.2908496732026144,\n \"acc_stderr\": 0.026004800363952113\n\
\ },\n \"original|mmlu:philosophy|5\": {\n \"acc\": 0.24115755627009647,\n\
\ \"acc_stderr\": 0.024296594034763426\n },\n \"original|mmlu:prehistory|5\"\
: {\n \"acc\": 0.2191358024691358,\n \"acc_stderr\": 0.023016705640262196\n\
\ },\n \"original|mmlu:professional_accounting|5\": {\n \"acc\": 0.25886524822695034,\n\
\ \"acc_stderr\": 0.026129572527180848\n },\n \"original|mmlu:professional_law|5\"\
: {\n \"acc\": 0.26988265971316816,\n \"acc_stderr\": 0.011337381084250402\n\
\ },\n \"original|mmlu:professional_medicine|5\": {\n \"acc\": 0.40808823529411764,\n\
\ \"acc_stderr\": 0.029855261393483927\n },\n \"original|mmlu:professional_psychology|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.016819028375736386\n\
\ },\n \"original|mmlu:public_relations|5\": {\n \"acc\": 0.2545454545454545,\n\
\ \"acc_stderr\": 0.041723430387053825\n },\n \"original|mmlu:security_studies|5\"\
: {\n \"acc\": 0.37142857142857144,\n \"acc_stderr\": 0.03093285879278985\n\
\ },\n \"original|mmlu:sociology|5\": {\n \"acc\": 0.2736318407960199,\n\
\ \"acc_stderr\": 0.031524391865554016\n },\n \"original|mmlu:us_foreign_policy|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816\n },\n\
\ \"original|mmlu:virology|5\": {\n \"acc\": 0.23493975903614459,\n \
\ \"acc_stderr\": 0.03300533186128922\n },\n \"original|mmlu:world_religions|5\"\
: {\n \"acc\": 0.2046783625730994,\n \"acc_stderr\": 0.030944459778533197\n\
\ }\n}\n```"
repo_url: https://huggingface.co/None
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|arc:challenge|25_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hellaswag|10_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T12:49:48.066230.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T12:49:48.066230.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T12:49:48.066230.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T12:49:48.066230.parquet'
- config_name: original_mmlu_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T20:42:00.075340.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_abstract_algebra_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_anatomy_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_astronomy_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_business_ethics_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_clinical_knowledge_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_college_biology_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_college_chemistry_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_college_computer_science_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_college_mathematics_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_college_medicine_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_college_physics_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_computer_security_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_conceptual_physics_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_econometrics_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_electrical_engineering_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_elementary_mathematics_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_formal_logic_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_global_facts_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_high_school_biology_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_high_school_chemistry_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_high_school_computer_science_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_high_school_european_history_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_high_school_geography_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_high_school_mathematics_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_high_school_microeconomics_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_high_school_physics_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_high_school_psychology_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_high_school_statistics_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_high_school_us_history_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_high_school_world_history_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_human_aging_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_human_sexuality_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_international_law_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_jurisprudence_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_logical_fallacies_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_machine_learning_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_management_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:management|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:management|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_marketing_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_medical_genetics_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_miscellaneous_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_moral_disputes_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_moral_scenarios_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_nutrition_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_philosophy_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_prehistory_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_professional_accounting_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_professional_law_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_professional_medicine_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_professional_psychology_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_public_relations_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_security_studies_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_sociology_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_us_foreign_policy_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_virology_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:virology|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:virology|5_2023-08-28T20:42:00.075340.parquet'
- config_name: original_mmlu_world_religions_5
data_files:
- split: 2023_08_28T20_42_00.075340
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:42:00.075340.parquet'
- split: latest
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:42:00.075340.parquet'
- config_name: results
data_files:
- split: 2023_07_19T12_49_48.066230
path:
- results_2023-07-19T12:49:48.066230.parquet
- split: 2023_08_28T20_42_00.075340
path:
- results_2023-08-28T20:42:00.075340.parquet
- split: latest
path:
- results_2023-08-28T20:42:00.075340.parquet
---
# Dataset Card for Evaluation run of Writer/palmyra-base
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Writer/palmyra-base
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Writer/palmyra-base](https://huggingface.co/Writer/palmyra-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 119 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Writer__palmyra-base",
"original_mmlu_world_religions_5",
	split="latest")
```
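Each per-task configuration name follows the pattern `original_mmlu_<task>_5`, which mirrors the raw metric keys (such as `original|mmlu:world_religions|5`) used in the results below. As a small sketch, a hypothetical helper (not part of this card or the `datasets` library) to map a metric key to its config name:

```python
def config_name(metric_key: str) -> str:
    """Convert a raw metric key such as 'original|mmlu:world_religions|5'
    into the corresponding dataset config name
    ('original_mmlu_world_religions_5')."""
    return metric_key.replace("|", "_").replace(":", "_")

print(config_name("original|mmlu:world_religions|5"))
# → original_mmlu_world_religions_5
```

This is only a convenience for scripting over the per-task configs; the config names themselves are listed in the YAML metadata above.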
## Latest results
These are the [latest results from run 2023-08-28T20:42:00.075340](https://huggingface.co/datasets/open-llm-leaderboard/details_Writer__palmyra-base/blob/main/results_2023-08-28T20%3A42%3A00.075340.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.27537968563397885,
"acc_stderr": 0.033055212496120936
},
"original|mmlu:abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446
},
"original|mmlu:anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073461
},
"original|mmlu:astronomy|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.038607315993160904
},
"original|mmlu:business_ethics|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623
},
"original|mmlu:clinical_knowledge|5": {
"acc": 0.2981132075471698,
"acc_stderr": 0.028152837942493857
},
"original|mmlu:college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532
},
"original|mmlu:college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236
},
"original|mmlu:college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276
},
"original|mmlu:college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316
},
"original|mmlu:college_medicine|5": {
"acc": 0.37572254335260113,
"acc_stderr": 0.03692820767264867
},
"original|mmlu:college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.04336432707993177
},
"original|mmlu:computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605
},
"original|mmlu:conceptual_physics|5": {
"acc": 0.22127659574468084,
"acc_stderr": 0.02713634960242406
},
"original|mmlu:econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344
},
"original|mmlu:electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.036001056927277716
},
"original|mmlu:elementary_mathematics|5": {
"acc": 0.291005291005291,
"acc_stderr": 0.02339382650048487
},
"original|mmlu:formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132
},
"original|mmlu:global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196
},
"original|mmlu:high_school_biology|5": {
"acc": 0.2967741935483871,
"acc_stderr": 0.025988500792411894
},
"original|mmlu:high_school_chemistry|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.031447125816782426
},
"original|mmlu:high_school_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269
},
"original|mmlu:high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.03401506715249039
},
"original|mmlu:high_school_geography|5": {
"acc": 0.35858585858585856,
"acc_stderr": 0.03416903640391521
},
"original|mmlu:high_school_government_and_politics|5": {
"acc": 0.3626943005181347,
"acc_stderr": 0.03469713791704371
},
"original|mmlu:high_school_macroeconomics|5": {
"acc": 0.358974358974359,
"acc_stderr": 0.024321738484602357
},
"original|mmlu:high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626
},
"original|mmlu:high_school_microeconomics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.030388353551886845
},
"original|mmlu:high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023
},
"original|mmlu:high_school_psychology|5": {
"acc": 0.3467889908256881,
"acc_stderr": 0.020406097104093027
},
"original|mmlu:high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636
},
"original|mmlu:high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154
},
"original|mmlu:high_school_world_history|5": {
"acc": 0.2320675105485232,
"acc_stderr": 0.027479744550808524
},
"original|mmlu:human_aging|5": {
"acc": 0.11659192825112108,
"acc_stderr": 0.02153963981624447
},
"original|mmlu:human_sexuality|5": {
"acc": 0.20610687022900764,
"acc_stderr": 0.035477710041594626
},
"original|mmlu:international_law|5": {
"acc": 0.2809917355371901,
"acc_stderr": 0.04103203830514512
},
"original|mmlu:jurisprudence|5": {
"acc": 0.19444444444444445,
"acc_stderr": 0.03826076324884863
},
"original|mmlu:logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354
},
"original|mmlu:machine_learning|5": {
"acc": 0.16964285714285715,
"acc_stderr": 0.03562367850095391
},
"original|mmlu:management|5": {
"acc": 0.34951456310679613,
"acc_stderr": 0.04721188506097173
},
"original|mmlu:marketing|5": {
"acc": 0.2264957264957265,
"acc_stderr": 0.027421007295392926
},
"original|mmlu:medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446
},
"original|mmlu:miscellaneous|5": {
"acc": 0.2120051085568327,
"acc_stderr": 0.014616099385833694
},
"original|mmlu:moral_disputes|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.022497230190967554
},
"original|mmlu:moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588
},
"original|mmlu:nutrition|5": {
"acc": 0.2908496732026144,
"acc_stderr": 0.026004800363952113
},
"original|mmlu:philosophy|5": {
"acc": 0.24115755627009647,
"acc_stderr": 0.024296594034763426
},
"original|mmlu:prehistory|5": {
"acc": 0.2191358024691358,
"acc_stderr": 0.023016705640262196
},
"original|mmlu:professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180848
},
"original|mmlu:professional_law|5": {
"acc": 0.26988265971316816,
"acc_stderr": 0.011337381084250402
},
"original|mmlu:professional_medicine|5": {
"acc": 0.40808823529411764,
"acc_stderr": 0.029855261393483927
},
"original|mmlu:professional_psychology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.016819028375736386
},
"original|mmlu:public_relations|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.041723430387053825
},
"original|mmlu:security_studies|5": {
"acc": 0.37142857142857144,
"acc_stderr": 0.03093285879278985
},
"original|mmlu:sociology|5": {
"acc": 0.2736318407960199,
"acc_stderr": 0.031524391865554016
},
"original|mmlu:us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816
},
"original|mmlu:virology|5": {
"acc": 0.23493975903614459,
"acc_stderr": 0.03300533186128922
},
"original|mmlu:world_religions|5": {
"acc": 0.2046783625730994,
"acc_stderr": 0.030944459778533197
}
}
```
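The "all" entry above summarizes the per-task accuracies. As a rough sketch, a macro-average can be recomputed from the per-task scores; note that whether the card's "all" value is an unweighted mean is an assumption here, not something the card states, and only three tasks are copied below for illustration:

```python
# Sketch: recompute a macro-average accuracy from per-task scores.
# The three entries below are copied from the results above purely for
# illustration; treating "all" as an unweighted mean is an assumption.
per_task = {
    "original|mmlu:abstract_algebra|5": {"acc": 0.25},
    "original|mmlu:anatomy|5": {"acc": 0.22962962962962963},
    "original|mmlu:astronomy|5": {"acc": 0.34210526315789475},
}

accs = [scores["acc"] for scores in per_task.values()]
macro_avg = sum(accs) / len(accs)
print(round(macro_avg, 4))  # → 0.2739 (mean over this three-task subset)
```

The same loop applied to all 57 task entries reproduces an aggregate comparable to the "all" block.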
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Writer__camel-5b-hf | 2023-08-27T12:39:44.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Writer/camel-5b-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Writer/camel-5b-hf](https://huggingface.co/Writer/camel-5b-hf) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Writer__camel-5b-hf\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-19T15:25:02.904083](https://huggingface.co/datasets/open-llm-leaderboard/details_Writer__camel-5b-hf/blob/main/results_2023-07-19T15%3A25%3A02.904083.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2645140424226894,\n\
\ \"acc_stderr\": 0.031848011837527175,\n \"acc_norm\": 0.26754200871505684,\n\
\ \"acc_norm_stderr\": 0.03185434599476783,\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.015077219200662594,\n \"mc2\": 0.40652329480691085,\n\
\ \"mc2_stderr\": 0.014792493882118896\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.31399317406143346,\n \"acc_stderr\": 0.013562691224726286,\n\
\ \"acc_norm\": 0.3515358361774744,\n \"acc_norm_stderr\": 0.013952413699600943\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.43507269468233417,\n\
\ \"acc_stderr\": 0.004947533158712096,\n \"acc_norm\": 0.5761800438159729,\n\
\ \"acc_norm_stderr\": 0.004931525961035755\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.28888888888888886,\n\
\ \"acc_stderr\": 0.0391545063041425,\n \"acc_norm\": 0.28888888888888886,\n\
\ \"acc_norm_stderr\": 0.0391545063041425\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03317672787533157,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03317672787533157\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n\
\ \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24528301886792453,\n \"acc_stderr\": 0.02648035717989569,\n\
\ \"acc_norm\": 0.24528301886792453,\n \"acc_norm_stderr\": 0.02648035717989569\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.22916666666666666,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n\
\ \"acc_stderr\": 0.03186209851641144,\n \"acc_norm\": 0.2254335260115607,\n\
\ \"acc_norm_stderr\": 0.03186209851641144\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.03047297336338005,\n\
\ \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.03047297336338005\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281334,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n\
\ \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24338624338624337,\n \"acc_stderr\": 0.02210112878741543,\n \"\
acc_norm\": 0.24338624338624337,\n \"acc_norm_stderr\": 0.02210112878741543\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.16666666666666666,\n\
\ \"acc_stderr\": 0.03333333333333338,\n \"acc_norm\": 0.16666666666666666,\n\
\ \"acc_norm_stderr\": 0.03333333333333338\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.25161290322580643,\n \"acc_stderr\": 0.024685979286239952,\n \"\
acc_norm\": 0.25161290322580643,\n \"acc_norm_stderr\": 0.024685979286239952\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.29064039408866993,\n \"acc_stderr\": 0.03194740072265541,\n \"\
acc_norm\": 0.29064039408866993,\n \"acc_norm_stderr\": 0.03194740072265541\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.03477691162163659,\n\
\ \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.03477691162163659\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.24242424242424243,\n \"acc_stderr\": 0.03053289223393203,\n \"\
acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03053289223393203\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.23316062176165803,\n \"acc_stderr\": 0.030516111371476008,\n\
\ \"acc_norm\": 0.23316062176165803,\n \"acc_norm_stderr\": 0.030516111371476008\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24615384615384617,\n \"acc_stderr\": 0.02184086699042309,\n\
\ \"acc_norm\": 0.24615384615384617,\n \"acc_norm_stderr\": 0.02184086699042309\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.02659393910184406,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.02659393910184406\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.25210084033613445,\n \"acc_stderr\": 0.028205545033277726,\n\
\ \"acc_norm\": 0.25210084033613445,\n \"acc_norm_stderr\": 0.028205545033277726\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.03543304234389985,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.03543304234389985\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23486238532110093,\n \"acc_stderr\": 0.01817511051034359,\n \"\
acc_norm\": 0.23486238532110093,\n \"acc_norm_stderr\": 0.01817511051034359\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.18055555555555555,\n \"acc_stderr\": 0.02623287897149166,\n \"\
acc_norm\": 0.18055555555555555,\n \"acc_norm_stderr\": 0.02623287897149166\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.28431372549019607,\n \"acc_stderr\": 0.03166009679399812,\n \"\
acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.03166009679399812\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.03068582059661079,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03068582059661079\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3632286995515695,\n\
\ \"acc_stderr\": 0.03227790442850499,\n \"acc_norm\": 0.3632286995515695,\n\
\ \"acc_norm_stderr\": 0.03227790442850499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.037276735755969195,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.037276735755969195\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3140495867768595,\n \"acc_stderr\": 0.04236964753041017,\n \"\
acc_norm\": 0.3140495867768595,\n \"acc_norm_stderr\": 0.04236964753041017\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.32515337423312884,\n \"acc_stderr\": 0.03680350371286462,\n\
\ \"acc_norm\": 0.32515337423312884,\n \"acc_norm_stderr\": 0.03680350371286462\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340455,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340455\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384493,\n\
\ \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384493\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2948717948717949,\n\
\ \"acc_stderr\": 0.029872577708891165,\n \"acc_norm\": 0.2948717948717949,\n\
\ \"acc_norm_stderr\": 0.029872577708891165\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2886334610472541,\n\
\ \"acc_stderr\": 0.016203792703197804,\n \"acc_norm\": 0.2886334610472541,\n\
\ \"acc_norm_stderr\": 0.016203792703197804\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.02353292543104428,\n\
\ \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.02353292543104428\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875195,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875195\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.31189710610932475,\n\
\ \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.31189710610932475,\n\
\ \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.024922001168886345,\n\
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.024922001168886345\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24113475177304963,\n \"acc_stderr\": 0.02551873104953778,\n \
\ \"acc_norm\": 0.24113475177304963,\n \"acc_norm_stderr\": 0.02551873104953778\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27835723598435463,\n\
\ \"acc_stderr\": 0.011446990197380984,\n \"acc_norm\": 0.27835723598435463,\n\
\ \"acc_norm_stderr\": 0.011446990197380984\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20955882352941177,\n \"acc_stderr\": 0.024723110407677055,\n\
\ \"acc_norm\": 0.20955882352941177,\n \"acc_norm_stderr\": 0.024723110407677055\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2761437908496732,\n \"acc_stderr\": 0.018087276935663137,\n \
\ \"acc_norm\": 0.2761437908496732,\n \"acc_norm_stderr\": 0.018087276935663137\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.32727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.32727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n\
\ \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.29850746268656714,\n\
\ \"acc_stderr\": 0.0323574378935504,\n \"acc_norm\": 0.29850746268656714,\n\
\ \"acc_norm_stderr\": 0.0323574378935504\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3313253012048193,\n\
\ \"acc_stderr\": 0.036643147772880864,\n \"acc_norm\": 0.3313253012048193,\n\
\ \"acc_norm_stderr\": 0.036643147772880864\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.27485380116959063,\n \"acc_stderr\": 0.03424042924691582,\n\
\ \"acc_norm\": 0.27485380116959063,\n \"acc_norm_stderr\": 0.03424042924691582\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.015077219200662594,\n \"mc2\": 0.40652329480691085,\n\
\ \"mc2_stderr\": 0.014792493882118896\n }\n}\n```"
repo_url: https://huggingface.co/haonan-li/bactrian-x-llama-13b-merged
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:25:02.904083.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:25:02.904083.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:25:02.904083.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:25:02.904083.parquet'
- config_name: results
data_files:
- split: 2023_07_19T15_25_02.904083
path:
- results_2023-07-19T15:25:02.904083.parquet
- split: latest
path:
- results_2023-07-19T15:25:02.904083.parquet
---
# Dataset Card for Evaluation run of Writer/camel-5b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Writer/camel-5b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Writer/camel-5b-hf](https://huggingface.co/Writer/camel-5b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Writer__camel-5b-hf",
"harness_truthfulqa_mc_0",
split="train")
```
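As an aside, the per-run split names used above (e.g. `2023_07_19T15_25_02.904083`) appear to be the run timestamp with `-` and `:` replaced by `_`. A minimal sketch of that mapping (an observation from the config names in this card, not an official API):

```python
# Sketch: derive a per-run split name from a run timestamp, assuming the
# naming observed in the configs above ("-" and ":" become "_", "." is kept).
def split_name_from_timestamp(timestamp: str) -> str:
    return timestamp.replace("-", "_").replace(":", "_")

print(split_name_from_timestamp("2023-07-19T15:25:02.904083"))
# -> 2023_07_19T15_25_02.904083
```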
## Latest results
These are the [latest results from run 2023-07-19T15:25:02.904083](https://huggingface.co/datasets/open-llm-leaderboard/details_Writer__camel-5b-hf/blob/main/results_2023-07-19T15%3A25%3A02.904083.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2645140424226894,
"acc_stderr": 0.031848011837527175,
"acc_norm": 0.26754200871505684,
"acc_norm_stderr": 0.03185434599476783,
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662594,
"mc2": 0.40652329480691085,
"mc2_stderr": 0.014792493882118896
},
"harness|arc:challenge|25": {
"acc": 0.31399317406143346,
"acc_stderr": 0.013562691224726286,
"acc_norm": 0.3515358361774744,
"acc_norm_stderr": 0.013952413699600943
},
"harness|hellaswag|10": {
"acc": 0.43507269468233417,
"acc_stderr": 0.004947533158712096,
"acc_norm": 0.5761800438159729,
"acc_norm_stderr": 0.004931525961035755
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.0391545063041425,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.0391545063041425
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24528301886792453,
"acc_stderr": 0.02648035717989569,
"acc_norm": 0.24528301886792453,
"acc_norm_stderr": 0.02648035717989569
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641144,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641144
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.03047297336338005,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.03047297336338005
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281334,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24338624338624337,
"acc_stderr": 0.02210112878741543,
"acc_norm": 0.24338624338624337,
"acc_norm_stderr": 0.02210112878741543
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03333333333333338,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03333333333333338
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25161290322580643,
"acc_stderr": 0.024685979286239952,
"acc_norm": 0.25161290322580643,
"acc_norm_stderr": 0.024685979286239952
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.29064039408866993,
"acc_stderr": 0.03194740072265541,
"acc_norm": 0.29064039408866993,
"acc_norm_stderr": 0.03194740072265541
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03053289223393203,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03053289223393203
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23316062176165803,
"acc_stderr": 0.030516111371476008,
"acc_norm": 0.23316062176165803,
"acc_norm_stderr": 0.030516111371476008
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24615384615384617,
"acc_stderr": 0.02184086699042309,
"acc_norm": 0.24615384615384617,
"acc_norm_stderr": 0.02184086699042309
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.02659393910184406,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.02659393910184406
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.25210084033613445,
"acc_stderr": 0.028205545033277726,
"acc_norm": 0.25210084033613445,
"acc_norm_stderr": 0.028205545033277726
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.03543304234389985,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.03543304234389985
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23486238532110093,
"acc_stderr": 0.01817511051034359,
"acc_norm": 0.23486238532110093,
"acc_norm_stderr": 0.01817511051034359
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18055555555555555,
"acc_stderr": 0.02623287897149166,
"acc_norm": 0.18055555555555555,
"acc_norm_stderr": 0.02623287897149166
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.03166009679399812,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.03166009679399812
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03068582059661079,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03068582059661079
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3632286995515695,
"acc_stderr": 0.03227790442850499,
"acc_norm": 0.3632286995515695,
"acc_norm_stderr": 0.03227790442850499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.037276735755969195,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.037276735755969195
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3140495867768595,
"acc_stderr": 0.04236964753041017,
"acc_norm": 0.3140495867768595,
"acc_norm_stderr": 0.04236964753041017
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.32515337423312884,
"acc_stderr": 0.03680350371286462,
"acc_norm": 0.32515337423312884,
"acc_norm_stderr": 0.03680350371286462
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340455,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340455
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384493,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384493
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.029872577708891165,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.029872577708891165
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2886334610472541,
"acc_stderr": 0.016203792703197804,
"acc_norm": 0.2886334610472541,
"acc_norm_stderr": 0.016203792703197804
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.02353292543104428,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.02353292543104428
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.31189710610932475,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.31189710610932475,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.024922001168886345,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.024922001168886345
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24113475177304963,
"acc_stderr": 0.02551873104953778,
"acc_norm": 0.24113475177304963,
"acc_norm_stderr": 0.02551873104953778
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27835723598435463,
"acc_stderr": 0.011446990197380984,
"acc_norm": 0.27835723598435463,
"acc_norm_stderr": 0.011446990197380984
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20955882352941177,
"acc_stderr": 0.024723110407677055,
"acc_norm": 0.20955882352941177,
"acc_norm_stderr": 0.024723110407677055
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2761437908496732,
"acc_stderr": 0.018087276935663137,
"acc_norm": 0.2761437908496732,
"acc_norm_stderr": 0.018087276935663137
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.32727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.32727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.29850746268656714,
"acc_stderr": 0.0323574378935504,
"acc_norm": 0.29850746268656714,
"acc_norm_stderr": 0.0323574378935504
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3313253012048193,
"acc_stderr": 0.036643147772880864,
"acc_norm": 0.3313253012048193,
"acc_norm_stderr": 0.036643147772880864
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.27485380116959063,
"acc_stderr": 0.03424042924691582,
"acc_norm": 0.27485380116959063,
"acc_norm_stderr": 0.03424042924691582
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662594,
"mc2": 0.40652329480691085,
"mc2_stderr": 0.014792493882118896
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-high_school_microeconomics-neg-prepend-fix | 2023-08-21T07:36:31.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 5450
num_examples: 5
- name: test
num_bytes: 608302
num_examples: 238
download_size: 12377
dataset_size: 613752
---
# Dataset Card for "mmlu-high_school_microeconomics-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ariellee__SuperPlatty-30B | 2023-09-18T02:07:47.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of ariellee/SuperPlatty-30B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ariellee/SuperPlatty-30B](https://huggingface.co/ariellee/SuperPlatty-30B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ariellee__SuperPlatty-30B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-18T02:07:35.126517](https://huggingface.co/datasets/open-llm-leaderboard/details_ariellee__SuperPlatty-30B/blob/main/results_2023-09-18T02-07-35.126517.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.4521812080536913,\n\
\ \"em_stderr\": 0.0050969968963073785,\n \"f1\": 0.4944075083892625,\n\
\ \"f1_stderr\": 0.004926147134745944,\n \"acc\": 0.44987891738317937,\n\
\ \"acc_stderr\": 0.009646692360892724\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.4521812080536913,\n \"em_stderr\": 0.0050969968963073785,\n\
\ \"f1\": 0.4944075083892625,\n \"f1_stderr\": 0.004926147134745944\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09628506444275967,\n \
\ \"acc_stderr\": 0.008125264128215882\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8034727703235991,\n \"acc_stderr\": 0.011168120593569567\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ariellee/SuperPlatty-30B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|arc:challenge|25_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_18T02_07_35.126517
path:
- '**/details_harness|drop|3_2023-09-18T02-07-35.126517.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-18T02-07-35.126517.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_18T02_07_35.126517
path:
- '**/details_harness|gsm8k|5_2023-09-18T02-07-35.126517.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-18T02-07-35.126517.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hellaswag|10_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:30:48.838844.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T22:30:48.838844.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T22:30:48.838844.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_18T02_07_35.126517
path:
- '**/details_harness|winogrande|5_2023-09-18T02-07-35.126517.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-18T02-07-35.126517.parquet'
- config_name: results
data_files:
- split: 2023_07_19T22_30_48.838844
path:
- results_2023-07-19T22:30:48.838844.parquet
- split: 2023_09_18T02_07_35.126517
path:
- results_2023-09-18T02-07-35.126517.parquet
- split: latest
path:
- results_2023-09-18T02-07-35.126517.parquet
---
# Dataset Card for Evaluation run of ariellee/SuperPlatty-30B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ariellee/SuperPlatty-30B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ariellee/SuperPlatty-30B](https://huggingface.co/ariellee/SuperPlatty-30B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ariellee__SuperPlatty-30B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-18T02:07:35.126517](https://huggingface.co/datasets/open-llm-leaderboard/details_ariellee__SuperPlatty-30B/blob/main/results_2023-09-18T02-07-35.126517.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.4521812080536913,
"em_stderr": 0.0050969968963073785,
"f1": 0.4944075083892625,
"f1_stderr": 0.004926147134745944,
"acc": 0.44987891738317937,
"acc_stderr": 0.009646692360892724
},
"harness|drop|3": {
"em": 0.4521812080536913,
"em_stderr": 0.0050969968963073785,
"f1": 0.4944075083892625,
"f1_stderr": 0.004926147134745944
},
"harness|gsm8k|5": {
"acc": 0.09628506444275967,
"acc_stderr": 0.008125264128215882
},
"harness|winogrande|5": {
"acc": 0.8034727703235991,
"acc_stderr": 0.011168120593569567
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-high_school_physics-neg-prepend-fix | 2023-08-21T07:36:44.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 6996
num_examples: 5
- name: test
num_bytes: 466852
num_examples: 151
download_size: 14956
dataset_size: 473848
---
# Dataset Card for "mmlu-high_school_physics-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_roneneldan__TinyStories-33M | 2023-09-23T05:35:23.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of roneneldan/TinyStories-33M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [roneneldan/TinyStories-33M](https://huggingface.co/roneneldan/TinyStories-33M)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_roneneldan__TinyStories-33M\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T05:35:11.802678](https://huggingface.co/datasets/open-llm-leaderboard/details_roneneldan__TinyStories-33M/blob/main/results_2023-09-23T05-35-11.802678.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0003145973154362416,\n\
\ \"em_stderr\": 0.0001816137946884096,\n \"f1\": 0.001937919463087248,\n\
\ \"f1_stderr\": 0.0003031702602652814,\n \"acc\": 0.24546172059984214,\n\
\ \"acc_stderr\": 0.007025085047248846\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0003145973154362416,\n \"em_stderr\": 0.0001816137946884096,\n\
\ \"f1\": 0.001937919463087248,\n \"f1_stderr\": 0.0003031702602652814\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4909234411996843,\n\
\ \"acc_stderr\": 0.014050170094497692\n }\n}\n```"
repo_url: https://huggingface.co/roneneldan/TinyStories-33M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T05_35_11.802678
path:
- '**/details_harness|drop|3_2023-09-23T05-35-11.802678.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T05-35-11.802678.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T05_35_11.802678
path:
- '**/details_harness|gsm8k|5_2023-09-23T05-35-11.802678.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T05-35-11.802678.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T05_35_11.802678
path:
- '**/details_harness|winogrande|5_2023-09-23T05-35-11.802678.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T05-35-11.802678.parquet'
- config_name: results
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- results_2023-07-19T13:32:19.766363.parquet
- split: 2023_09_23T05_35_11.802678
path:
- results_2023-09-23T05-35-11.802678.parquet
- split: latest
path:
- results_2023-09-23T05-35-11.802678.parquet
---
# Dataset Card for Evaluation run of roneneldan/TinyStories-33M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/roneneldan/TinyStories-33M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [roneneldan/TinyStories-33M](https://huggingface.co/roneneldan/TinyStories-33M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_roneneldan__TinyStories-33M",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-23T05:35:11.802678](https://huggingface.co/datasets/open-llm-leaderboard/details_roneneldan__TinyStories-33M/blob/main/results_2023-09-23T05-35-11.802678.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0003145973154362416,
"em_stderr": 0.0001816137946884096,
"f1": 0.001937919463087248,
"f1_stderr": 0.0003031702602652814,
"acc": 0.24546172059984214,
"acc_stderr": 0.007025085047248846
},
"harness|drop|3": {
"em": 0.0003145973154362416,
"em_stderr": 0.0001816137946884096,
"f1": 0.001937919463087248,
"f1_stderr": 0.0003031702602652814
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.4909234411996843,
"acc_stderr": 0.014050170094497692
}
}
```
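As a minimal sketch (assuming a results dictionary with the nested shape shown above, where each per-task entry may report different metrics), the per-task accuracy values can be pulled out like this:

```python
# Results dictionary shaped like the JSON above: an "all" aggregate plus
# one entry per task, each reporting its own metrics.
results = {
    "all": {"acc": 0.24546172059984214},
    "harness|drop|3": {"em": 0.0003145973154362416, "f1": 0.001937919463087248},
    "harness|gsm8k|5": {"acc": 0.0},
    "harness|winogrande|5": {"acc": 0.4909234411996843},
}

# Keep only the per-task entries (skip the "all" aggregate) that report
# an accuracy; tasks like drop only report em/f1 and are filtered out.
per_task_acc = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}
print(per_task_acc)
```

This filters on the presence of the `acc` key rather than on task names, so it keeps working when successive evals add or drop tasks.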
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_roneneldan__TinyStories-3M | 2023-08-27T12:39:49.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of roneneldan/TinyStories-3M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [roneneldan/TinyStories-3M](https://huggingface.co/roneneldan/TinyStories-3M)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_roneneldan__TinyStories-3M\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-19T13:26:26.672547](https://huggingface.co/datasets/open-llm-leaderboard/details_roneneldan__TinyStories-3M/blob/main/results_2023-07-19T13%3A26%3A26.672547.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.249047843049329,\n\
\ \"acc_stderr\": 0.0315170725080696,\n \"acc_norm\": 0.24949638723109935,\n\
\ \"acc_norm_stderr\": 0.03152705750100707,\n \"mc1\": 0.21297429620563035,\n\
\ \"mc1_stderr\": 0.014332203787059685,\n \"mc2\": 0.47326829150220334,\n\
\ \"mc2_stderr\": 0.016452140961048557\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.19197952218430034,\n \"acc_stderr\": 0.011509598906598095,\n\
\ \"acc_norm\": 0.22013651877133106,\n \"acc_norm_stderr\": 0.012108124883460983\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2575184226249751,\n\
\ \"acc_stderr\": 0.00436373641068962,\n \"acc_norm\": 0.25582553276239794,\n\
\ \"acc_norm_stderr\": 0.004354325017137536\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.35555555555555557,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.35555555555555557,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03317672787533157,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03317672787533157\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2641509433962264,\n \"acc_stderr\": 0.027134291628741713,\n\
\ \"acc_norm\": 0.2641509433962264,\n \"acc_norm_stderr\": 0.027134291628741713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774711,\n \
\ \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774711\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n\
\ \"acc_stderr\": 0.0321473730202947,\n \"acc_norm\": 0.23121387283236994,\n\
\ \"acc_norm_stderr\": 0.0321473730202947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179328,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179328\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3148936170212766,\n \"acc_stderr\": 0.030363582197238167,\n\
\ \"acc_norm\": 0.3148936170212766,\n \"acc_norm_stderr\": 0.030363582197238167\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.0409698513984367,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.0409698513984367\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.32413793103448274,\n \"acc_stderr\": 0.03900432069185554,\n\
\ \"acc_norm\": 0.32413793103448274,\n \"acc_norm_stderr\": 0.03900432069185554\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918417,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918417\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047182,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047182\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2645161290322581,\n\
\ \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.2645161290322581,\n\
\ \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2561576354679803,\n \"acc_stderr\": 0.0307127300709826,\n\
\ \"acc_norm\": 0.2561576354679803,\n \"acc_norm_stderr\": 0.0307127300709826\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21212121212121213,\n \"acc_stderr\": 0.029126522834586818,\n \"\
acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.029126522834586818\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860667,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860667\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2205128205128205,\n \"acc_stderr\": 0.02102067268082791,\n \
\ \"acc_norm\": 0.2205128205128205,\n \"acc_norm_stderr\": 0.02102067268082791\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275794,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275794\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.226890756302521,\n \"acc_stderr\": 0.027205371538279483,\n \
\ \"acc_norm\": 0.226890756302521,\n \"acc_norm_stderr\": 0.027205371538279483\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2119205298013245,\n \"acc_stderr\": 0.03336767086567977,\n \"\
acc_norm\": 0.2119205298013245,\n \"acc_norm_stderr\": 0.03336767086567977\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21834862385321102,\n \"acc_stderr\": 0.017712600528722724,\n \"\
acc_norm\": 0.21834862385321102,\n \"acc_norm_stderr\": 0.017712600528722724\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.02453632602613422,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.02453632602613422\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.27450980392156865,\n \"acc_stderr\": 0.031321798030832904,\n \"\
acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.031321798030832904\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.24050632911392406,\n \"acc_stderr\": 0.02782078198114968,\n \
\ \"acc_norm\": 0.24050632911392406,\n \"acc_norm_stderr\": 0.02782078198114968\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3183856502242152,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.3183856502242152,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.03957835471980978,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.03957835471980978\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.03351953879521269,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.03351953879521269\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.04453254836326469,\n\
\ \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.04453254836326469\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.029343114798094476,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.029343114798094476\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2784163473818646,\n\
\ \"acc_stderr\": 0.01602829518899246,\n \"acc_norm\": 0.2784163473818646,\n\
\ \"acc_norm_stderr\": 0.01602829518899246\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.022497230190967537,\n\
\ \"acc_norm\": 0.2254335260115607,\n \"acc_norm_stderr\": 0.022497230190967537\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2324022346368715,\n\
\ \"acc_stderr\": 0.014125968754673403,\n \"acc_norm\": 0.2324022346368715,\n\
\ \"acc_norm_stderr\": 0.014125968754673403\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21241830065359477,\n \"acc_stderr\": 0.023420375478296132,\n\
\ \"acc_norm\": 0.21241830065359477,\n \"acc_norm_stderr\": 0.023420375478296132\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2572347266881029,\n\
\ \"acc_stderr\": 0.024826171289250888,\n \"acc_norm\": 0.2572347266881029,\n\
\ \"acc_norm_stderr\": 0.024826171289250888\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24691358024691357,\n \"acc_stderr\": 0.0239935017090421,\n\
\ \"acc_norm\": 0.24691358024691357,\n \"acc_norm_stderr\": 0.0239935017090421\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2978723404255319,\n \"acc_stderr\": 0.027281608344469414,\n \
\ \"acc_norm\": 0.2978723404255319,\n \"acc_norm_stderr\": 0.027281608344469414\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23663624511082137,\n\
\ \"acc_stderr\": 0.010855137351572744,\n \"acc_norm\": 0.23663624511082137,\n\
\ \"acc_norm_stderr\": 0.010855137351572744\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.02439819298665492,\n\
\ \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.02439819298665492\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25326797385620914,\n \"acc_stderr\": 0.017593486895366835,\n \
\ \"acc_norm\": 0.25326797385620914,\n \"acc_norm_stderr\": 0.017593486895366835\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.33636363636363636,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.33636363636363636,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n\
\ \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21890547263681592,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.21890547263681592,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.03460579907553027,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.03460579907553027\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03126781714663179,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03126781714663179\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21297429620563035,\n\
\ \"mc1_stderr\": 0.014332203787059685,\n \"mc2\": 0.47326829150220334,\n\
\ \"mc2_stderr\": 0.016452140961048557\n }\n}\n```"
repo_url: https://huggingface.co/haonan-li/bactrian-x-llama-13b-merged
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:26:26.672547.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:26:26.672547.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:26:26.672547.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:26:26.672547.parquet'
- config_name: results
data_files:
- split: 2023_07_19T13_26_26.672547
path:
- results_2023-07-19T13:26:26.672547.parquet
- split: latest
path:
- results_2023-07-19T13:26:26.672547.parquet
---
# Dataset Card for Evaluation run of roneneldan/TinyStories-3M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/roneneldan/TinyStories-3M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [roneneldan/TinyStories-3M](https://huggingface.co/roneneldan/TinyStories-3M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_roneneldan__TinyStories-3M",
"harness_truthfulqa_mc_0",
split="train")
```
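As a sketch of how the timestamped splits described above relate to "latest": because the split names are zero-padded timestamps, the most recent run can be picked with a plain lexicographic comparison. The split names below are taken from this card; the `newest_split` helper itself is hypothetical, not part of the `datasets` API.

```python
# Hypothetical helper: pick the most recent timestamped split name.
# Names like "2023_07_19T13_26_26.672547" sort correctly as plain strings
# because every date/time field is zero-padded.
def newest_split(split_names):
    timestamped = [s for s in split_names if s != "latest"]
    return max(timestamped)

splits = ["2023_07_19T13_26_26.672547", "2023_09_22T21_54_42.059067", "latest"]
print(newest_split(splits))  # -> 2023_09_22T21_54_42.059067
```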
## Latest results
These are the [latest results from run 2023-07-19T13:26:26.672547](https://huggingface.co/datasets/open-llm-leaderboard/details_roneneldan__TinyStories-3M/blob/main/results_2023-07-19T13%3A26%3A26.672547.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.249047843049329,
"acc_stderr": 0.0315170725080696,
"acc_norm": 0.24949638723109935,
"acc_norm_stderr": 0.03152705750100707,
"mc1": 0.21297429620563035,
"mc1_stderr": 0.014332203787059685,
"mc2": 0.47326829150220334,
"mc2_stderr": 0.016452140961048557
},
"harness|arc:challenge|25": {
"acc": 0.19197952218430034,
"acc_stderr": 0.011509598906598095,
"acc_norm": 0.22013651877133106,
"acc_norm_stderr": 0.012108124883460983
},
"harness|hellaswag|10": {
"acc": 0.2575184226249751,
"acc_stderr": 0.00436373641068962,
"acc_norm": 0.25582553276239794,
"acc_norm_stderr": 0.004354325017137536
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2641509433962264,
"acc_stderr": 0.027134291628741713,
"acc_norm": 0.2641509433962264,
"acc_norm_stderr": 0.027134291628741713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774711,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774711
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.0321473730202947,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.0321473730202947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179328,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179328
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3148936170212766,
"acc_stderr": 0.030363582197238167,
"acc_norm": 0.3148936170212766,
"acc_norm_stderr": 0.030363582197238167
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.0409698513984367,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.0409698513984367
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.32413793103448274,
"acc_stderr": 0.03900432069185554,
"acc_norm": 0.32413793103448274,
"acc_norm_stderr": 0.03900432069185554
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918417,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047182,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047182
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2645161290322581,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.2645161290322581,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2561576354679803,
"acc_stderr": 0.0307127300709826,
"acc_norm": 0.2561576354679803,
"acc_norm_stderr": 0.0307127300709826
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860667,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860667
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2205128205128205,
"acc_stderr": 0.02102067268082791,
"acc_norm": 0.2205128205128205,
"acc_norm_stderr": 0.02102067268082791
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275794,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275794
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.226890756302521,
"acc_stderr": 0.027205371538279483,
"acc_norm": 0.226890756302521,
"acc_norm_stderr": 0.027205371538279483
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2119205298013245,
"acc_stderr": 0.03336767086567977,
"acc_norm": 0.2119205298013245,
"acc_norm_stderr": 0.03336767086567977
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21834862385321102,
"acc_stderr": 0.017712600528722724,
"acc_norm": 0.21834862385321102,
"acc_norm_stderr": 0.017712600528722724
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.02453632602613422,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.02453632602613422
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.031321798030832904,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.031321798030832904
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.24050632911392406,
"acc_stderr": 0.02782078198114968,
"acc_norm": 0.24050632911392406,
"acc_norm_stderr": 0.02782078198114968
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3183856502242152,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.3183856502242152,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.03957835471980978,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.03957835471980978
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.03351953879521269,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.03351953879521269
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.2815533980582524,
"acc_stderr": 0.04453254836326469,
"acc_norm": 0.2815533980582524,
"acc_norm_stderr": 0.04453254836326469
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.029343114798094476,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.029343114798094476
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2784163473818646,
"acc_stderr": 0.01602829518899246,
"acc_norm": 0.2784163473818646,
"acc_norm_stderr": 0.01602829518899246
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.022497230190967537,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.022497230190967537
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2324022346368715,
"acc_stderr": 0.014125968754673403,
"acc_norm": 0.2324022346368715,
"acc_norm_stderr": 0.014125968754673403
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21241830065359477,
"acc_stderr": 0.023420375478296132,
"acc_norm": 0.21241830065359477,
"acc_norm_stderr": 0.023420375478296132
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2572347266881029,
"acc_stderr": 0.024826171289250888,
"acc_norm": 0.2572347266881029,
"acc_norm_stderr": 0.024826171289250888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24691358024691357,
"acc_stderr": 0.0239935017090421,
"acc_norm": 0.24691358024691357,
"acc_norm_stderr": 0.0239935017090421
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2978723404255319,
"acc_stderr": 0.027281608344469414,
"acc_norm": 0.2978723404255319,
"acc_norm_stderr": 0.027281608344469414
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23663624511082137,
"acc_stderr": 0.010855137351572744,
"acc_norm": 0.23663624511082137,
"acc_norm_stderr": 0.010855137351572744
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.02439819298665492,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.02439819298665492
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25326797385620914,
"acc_stderr": 0.017593486895366835,
"acc_norm": 0.25326797385620914,
"acc_norm_stderr": 0.017593486895366835
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.33636363636363636,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.33636363636363636,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21890547263681592,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.21890547263681592,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553027,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553027
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21297429620563035,
"mc1_stderr": 0.014332203787059685,
"mc2": 0.47326829150220334,
"mc2_stderr": 0.016452140961048557
}
}
```
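Once downloaded, the per-task dictionary above can be aggregated offline. A minimal sketch that averages accuracy over the MMLU (`hendrycksTest`) tasks; the `results` dict below is a small hypothetical subset trimmed from the JSON above, not the full run:

```python
# Sketch: average "acc" across hendrycksTest tasks in a results dict
# shaped like the JSON above (only a few keys shown here for brevity).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.25},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.35555555555555557},
    "harness|truthfulqa:mc|0": {"mc1": 0.21297429620563035},  # not MMLU, skipped
}

mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
mean_acc = sum(mmlu_accs) / len(mmlu_accs)
print(round(mean_acc, 4))  # -> 0.3028
```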
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-high_school_psychology-neg-prepend-fix | 2023-08-21T07:36:56.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 7683
num_examples: 5
- name: test
num_bytes: 1743191
num_examples: 545
download_size: 18091
dataset_size: 1750874
---
# Dataset Card for "mmlu-high_school_psychology-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_roneneldan__TinyStories-8M | 2023-09-22T21:54:53.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of roneneldan/TinyStories-8M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [roneneldan/TinyStories-8M](https://huggingface.co/roneneldan/TinyStories-8M)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_roneneldan__TinyStories-8M\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T21:54:42.059067](https://huggingface.co/datasets/open-llm-leaderboard/details_roneneldan__TinyStories-8M/blob/main/results_2023-09-22T21-54-42.059067.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.00010486577181208053,\n\
\ \"em_stderr\": 0.0001048657718120815,\n \"f1\": 0.0030820050335570444,\n\
\ \"f1_stderr\": 0.00023396683820156773,\n \"acc\": 0.2513812154696133,\n\
\ \"acc_stderr\": 0.007026135605808221\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.00010486577181208053,\n \"em_stderr\": 0.0001048657718120815,\n\
\ \"f1\": 0.0030820050335570444,\n \"f1_stderr\": 0.00023396683820156773\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5027624309392266,\n\
\ \"acc_stderr\": 0.014052271211616441\n }\n}\n```"
repo_url: https://huggingface.co/roneneldan/TinyStories-8M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T21_54_42.059067
path:
- '**/details_harness|drop|3_2023-09-22T21-54-42.059067.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T21-54-42.059067.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T21_54_42.059067
path:
- '**/details_harness|gsm8k|5_2023-09-22T21-54-42.059067.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T21-54-42.059067.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:29:12.033365.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:29:12.033365.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:29:12.033365.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T21_54_42.059067
path:
- '**/details_harness|winogrande|5_2023-09-22T21-54-42.059067.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T21-54-42.059067.parquet'
- config_name: results
data_files:
- split: 2023_07_19T13_29_12.033365
path:
- results_2023-07-19T13:29:12.033365.parquet
- split: 2023_09_22T21_54_42.059067
path:
- results_2023-09-22T21-54-42.059067.parquet
- split: latest
path:
- results_2023-09-22T21-54-42.059067.parquet
---
# Dataset Card for Evaluation run of roneneldan/TinyStories-8M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/roneneldan/TinyStories-8M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [roneneldan/TinyStories-8M](https://huggingface.co/roneneldan/TinyStories-8M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_roneneldan__TinyStories-8M",
"harness_winogrande_5",
	split="latest")
```
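The run timestamps and the split names in the configs above differ only in punctuation. A minimal sketch of that mapping (the exact normalization rule is an assumption inferred from the split names listed in this card, e.g. `2023_09_22T21_54_42.059067`):

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Derive the per-run split name from a run timestamp.

    Inferred from this card's configs: the run '2023-09-22T21:54:42.059067'
    appears as the split '2023_09_22T21_54_42.059067', i.e. '-' and ':'
    are replaced with '_' while 'T' and '.' are kept.
    """
    return timestamp.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2023-09-22T21:54:42.059067"))
# → 2023_09_22T21_54_42.059067
```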
## Latest results
These are the [latest results from run 2023-09-22T21:54:42.059067](https://huggingface.co/datasets/open-llm-leaderboard/details_roneneldan__TinyStories-8M/blob/main/results_2023-09-22T21-54-42.059067.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of its eval):
```python
{
"all": {
"em": 0.00010486577181208053,
"em_stderr": 0.0001048657718120815,
"f1": 0.0030820050335570444,
"f1_stderr": 0.00023396683820156773,
"acc": 0.2513812154696133,
"acc_stderr": 0.007026135605808221
},
"harness|drop|3": {
"em": 0.00010486577181208053,
"em_stderr": 0.0001048657718120815,
"f1": 0.0030820050335570444,
"f1_stderr": 0.00023396683820156773
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5027624309392266,
"acc_stderr": 0.014052271211616441
}
}
```
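For programmatic use, a minimal sketch that pulls the per-task accuracies out of this aggregated JSON (the dict shape and field names below are copied verbatim from the run file above; the abridgement to three entries is only for illustration):

```python
import json

# The "Latest results" JSON above, abridged to the fields used here.
results = json.loads("""
{
  "all": {"acc": 0.2513812154696133, "acc_stderr": 0.007026135605808221},
  "harness|gsm8k|5": {"acc": 0.0, "acc_stderr": 0.0},
  "harness|winogrande|5": {"acc": 0.5027624309392266, "acc_stderr": 0.014052271211616441}
}
""")

# Report each per-task accuracy with its standard error.
for task, metrics in results.items():
    if task != "all" and "acc" in metrics:
        print(f"{task}: {metrics['acc']:.4f} ± {metrics['acc_stderr']:.4f}")
# e.g. harness|winogrande|5: 0.5028 ± 0.0141
```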
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-high_school_statistics-neg-prepend-fix | 2023-08-21T07:37:09.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 9060
num_examples: 5
- name: test
num_bytes: 779208
num_examples: 216
download_size: 18867
dataset_size: 788268
---
# Dataset Card for "mmlu-high_school_statistics-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_roneneldan__TinyStories-1M | 2023-09-22T21:41:36.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of roneneldan/TinyStories-1M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [roneneldan/TinyStories-1M](https://huggingface.co/roneneldan/TinyStories-1M)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_roneneldan__TinyStories-1M\"\
  ,\n\t\"harness_winogrande_5\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2023-09-22T21:41:24.294253](https://huggingface.co/datasets/open-llm-leaderboard/details_roneneldan__TinyStories-1M/blob/main/results_2023-09-22T21-41-24.294253.json) (note\
  \ that there might be results for other tasks in the repo if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.00020973154362416107,\n\
\ \"em_stderr\": 0.00014829481977282063,\n \"f1\": 0.003178481543624158,\n\
\ \"f1_stderr\": 0.0002730192207643319,\n \"acc\": 0.26085240726124703,\n\
\ \"acc_stderr\": 0.007019619608242314\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.00020973154362416107,\n \"em_stderr\": 0.00014829481977282063,\n\
\ \"f1\": 0.003178481543624158,\n \"f1_stderr\": 0.0002730192207643319\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5217048145224941,\n\
\ \"acc_stderr\": 0.014039239216484627\n }\n}\n```"
repo_url: https://huggingface.co/roneneldan/TinyStories-1M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T21_41_24.294253
path:
- '**/details_harness|drop|3_2023-09-22T21-41-24.294253.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T21-41-24.294253.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T21_41_24.294253
path:
- '**/details_harness|gsm8k|5_2023-09-22T21-41-24.294253.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T21-41-24.294253.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:25:02.593147.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:25:02.593147.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:25:02.593147.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T21_41_24.294253
path:
- '**/details_harness|winogrande|5_2023-09-22T21-41-24.294253.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T21-41-24.294253.parquet'
- config_name: results
data_files:
- split: 2023_07_19T13_25_02.593147
path:
- results_2023-07-19T13:25:02.593147.parquet
- split: 2023_09_22T21_41_24.294253
path:
- results_2023-09-22T21-41-24.294253.parquet
- split: latest
path:
- results_2023-09-22T21-41-24.294253.parquet
---
# Dataset Card for Evaluation run of roneneldan/TinyStories-1M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/roneneldan/TinyStories-1M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [roneneldan/TinyStories-1M](https://huggingface.co/roneneldan/TinyStories-1M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_roneneldan__TinyStories-1M",
"harness_winogrande_5",
split="train")
```
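As the summary notes, each timestamped split name is derived from the run timestamp: comparing the run timestamp `2023-09-22T21:41:24.294253` with the split name `2023_09_22T21_41_24.294253` in the configs above suggests hyphens and colons are replaced by underscores. A minimal sketch of that mapping (an observation from this card, not an official API):

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to the split name pattern used in this card's configs."""
    # Hyphens and colons become underscores; the 'T' and fractional seconds stay.
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-09-22T21:41:24.294253"))
# → 2023_09_22T21_41_24.294253
```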
## Latest results
These are the [latest results from run 2023-09-22T21:41:24.294253](https://huggingface.co/datasets/open-llm-leaderboard/details_roneneldan__TinyStories-1M/blob/main/results_2023-09-22T21-41-24.294253.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.00020973154362416107,
"em_stderr": 0.00014829481977282063,
"f1": 0.003178481543624158,
"f1_stderr": 0.0002730192207643319,
"acc": 0.26085240726124703,
"acc_stderr": 0.007019619608242314
},
"harness|drop|3": {
"em": 0.00020973154362416107,
"em_stderr": 0.00014829481977282063,
"f1": 0.003178481543624158,
"f1_stderr": 0.0002730192207643319
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5217048145224941,
"acc_stderr": 0.014039239216484627
}
}
```
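Once fetched (e.g. via `json.load` on the linked results file), the nested metric dict above can be traversed directly by task key. A small sketch using a truncated copy of the values shown:

```python
# Truncated copy of the "Latest results" block above, for illustration only.
results = {
    "all": {"acc": 0.26085240726124703, "acc_stderr": 0.007019619608242314},
    "harness|winogrande|5": {"acc": 0.5217048145224941,
                             "acc_stderr": 0.014039239216484627},
}

# Pull out one task's accuracy together with its standard error.
wino = results["harness|winogrande|5"]
print(f"winogrande acc = {wino['acc']:.4f} ± {wino['acc_stderr']:.4f}")
# → winogrande acc = 0.5217 ± 0.0140
```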
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_roneneldan__TinyStories-28M | 2023-08-27T12:39:54.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of roneneldan/TinyStories-28M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [roneneldan/TinyStories-28M](https://huggingface.co/roneneldan/TinyStories-28M)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_roneneldan__TinyStories-28M\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-19T13:32:08.084027](https://huggingface.co/datasets/open-llm-leaderboard/details_roneneldan__TinyStories-28M/blob/main/results_2023-07-19T13%3A32%3A08.084027.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23484320425075647,\n\
\ \"acc_stderr\": 0.0307939466207197,\n \"acc_norm\": 0.2355754587984318,\n\
\ \"acc_norm_stderr\": 0.03080909034206086,\n \"mc1\": 0.2252141982864137,\n\
\ \"mc1_stderr\": 0.014623240768023498,\n \"mc2\": 0.4808347610689909,\n\
\ \"mc2_stderr\": 0.01657815321723807\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.18600682593856654,\n \"acc_stderr\": 0.011370940183266738,\n\
\ \"acc_norm\": 0.22781569965870307,\n \"acc_norm_stderr\": 0.012256708602326905\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25692093208524197,\n\
\ \"acc_stderr\": 0.004360424536145123,\n \"acc_norm\": 0.2583150766779526,\n\
\ \"acc_norm_stderr\": 0.004368135676213557\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.12,\n \"acc_stderr\": 0.032659863237109066,\n \
\ \"acc_norm\": 0.12,\n \"acc_norm_stderr\": 0.032659863237109066\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.26666666666666666,\n\
\ \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.26666666666666666,\n\
\ \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21710526315789475,\n \"acc_stderr\": 0.033550453048829226,\n\
\ \"acc_norm\": 0.21710526315789475,\n \"acc_norm_stderr\": 0.033550453048829226\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.28,\n\
\ \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24150943396226415,\n \"acc_stderr\": 0.026341480371118355,\n\
\ \"acc_norm\": 0.24150943396226415,\n \"acc_norm_stderr\": 0.026341480371118355\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2152777777777778,\n\
\ \"acc_stderr\": 0.034370793441061344,\n \"acc_norm\": 0.2152777777777778,\n\
\ \"acc_norm_stderr\": 0.034370793441061344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.15,\n \"acc_stderr\": 0.03588702812826372,\n \"acc_norm\"\
: 0.15,\n \"acc_norm_stderr\": 0.03588702812826372\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n\
\ \"acc_stderr\": 0.03186209851641145,\n \"acc_norm\": 0.2254335260115607,\n\
\ \"acc_norm_stderr\": 0.03186209851641145\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179327,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179327\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.02880998985410297,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.02880998985410297\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.04227054451232199,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.04227054451232199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.03600105692727773,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.03600105692727773\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.21428571428571427,\n \"acc_stderr\": 0.02113285918275444,\n \"\
acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.02113285918275444\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523812,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523812\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.24193548387096775,\n \"acc_stderr\": 0.024362599693031086,\n \"\
acc_norm\": 0.24193548387096775,\n \"acc_norm_stderr\": 0.024362599693031086\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.18226600985221675,\n \"acc_stderr\": 0.02716334085964515,\n \"\
acc_norm\": 0.18226600985221675,\n \"acc_norm_stderr\": 0.02716334085964515\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.29292929292929293,\n \"acc_stderr\": 0.032424979581788166,\n \"\
acc_norm\": 0.29292929292929293,\n \"acc_norm_stderr\": 0.032424979581788166\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.23834196891191708,\n \"acc_stderr\": 0.030748905363909906,\n\
\ \"acc_norm\": 0.23834196891191708,\n \"acc_norm_stderr\": 0.030748905363909906\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.02075242372212801,\n \
\ \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.02075242372212801\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073835,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073835\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24369747899159663,\n \"acc_stderr\": 0.02788682807838056,\n\
\ \"acc_norm\": 0.24369747899159663,\n \"acc_norm_stderr\": 0.02788682807838056\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2185430463576159,\n \"acc_stderr\": 0.03374235550425694,\n \"\
acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.03374235550425694\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.25871559633027524,\n \"acc_stderr\": 0.01877605231961962,\n \"\
acc_norm\": 0.25871559633027524,\n \"acc_norm_stderr\": 0.01877605231961962\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.16203703703703703,\n \"acc_stderr\": 0.02513045365226846,\n \"\
acc_norm\": 0.16203703703703703,\n \"acc_norm_stderr\": 0.02513045365226846\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501947,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501947\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598014,\n \
\ \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598014\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2556053811659193,\n\
\ \"acc_stderr\": 0.029275891003969927,\n \"acc_norm\": 0.2556053811659193,\n\
\ \"acc_norm_stderr\": 0.029275891003969927\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.0384487613978527,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.0384487613978527\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2727272727272727,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.18404907975460122,\n \"acc_stderr\": 0.03044677768797174,\n\
\ \"acc_norm\": 0.18404907975460122,\n \"acc_norm_stderr\": 0.03044677768797174\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.027236013946196694,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.027236013946196694\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.21676300578034682,\n \"acc_stderr\": 0.02218347766841285,\n\
\ \"acc_norm\": 0.21676300578034682,\n \"acc_norm_stderr\": 0.02218347766841285\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21895424836601307,\n \"acc_stderr\": 0.02367908986180772,\n\
\ \"acc_norm\": 0.21895424836601307,\n \"acc_norm_stderr\": 0.02367908986180772\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2347266881028939,\n\
\ \"acc_stderr\": 0.024071805887677048,\n \"acc_norm\": 0.2347266881028939,\n\
\ \"acc_norm_stderr\": 0.024071805887677048\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2127659574468085,\n \"acc_stderr\": 0.024414612974307717,\n \
\ \"acc_norm\": 0.2127659574468085,\n \"acc_norm_stderr\": 0.024414612974307717\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24511082138200782,\n\
\ \"acc_stderr\": 0.010986307870045496,\n \"acc_norm\": 0.24511082138200782,\n\
\ \"acc_norm_stderr\": 0.010986307870045496\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2549019607843137,\n \"acc_stderr\": 0.017630827375148383,\n \
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.017630827375148383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2727272727272727,\n\
\ \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.2727272727272727,\n\
\ \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.20408163265306123,\n \"acc_stderr\": 0.025801283475090506,\n\
\ \"acc_norm\": 0.20408163265306123,\n \"acc_norm_stderr\": 0.025801283475090506\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21890547263681592,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.21890547263681592,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.26506024096385544,\n\
\ \"acc_stderr\": 0.03436024037944967,\n \"acc_norm\": 0.26506024096385544,\n\
\ \"acc_norm_stderr\": 0.03436024037944967\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2252141982864137,\n\
\ \"mc1_stderr\": 0.014623240768023498,\n \"mc2\": 0.4808347610689909,\n\
\ \"mc2_stderr\": 0.01657815321723807\n }\n}\n```"
repo_url: https://huggingface.co/roneneldan/TinyStories-28M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:32:08.084027.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:32:08.084027.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:32:08.084027.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:32:08.084027.parquet'
- config_name: results
data_files:
- split: 2023_07_19T13_32_08.084027
path:
- results_2023-07-19T13:32:08.084027.parquet
- split: latest
path:
- results_2023-07-19T13:32:08.084027.parquet
---
# Dataset Card for Evaluation run of roneneldan/TinyStories-28M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/roneneldan/TinyStories-28M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [roneneldan/TinyStories-28M](https://huggingface.co/roneneldan/TinyStories-28M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_roneneldan__TinyStories-28M",
"harness_truthfulqa_mc_0",
split="train")
```
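Because the timestamped split names are zero-padded (`YYYY_MM_DDTHH_MM_SS....`), they sort lexicographically in chronological order, so the run that the `latest` split aliases is simply the maximum of the timestamped split names. A minimal sketch of that convention (the `splits` list here is illustrative, not read from the Hub):

```python
# Timestamped split names sort lexicographically because the components
# are zero-padded, so the newest run is just max() over the names.
splits = [
    "2023_07_19T13_32_08.084027",
    "2023_09_23T05_14_21.286059",
]

latest = max(splits)  # the run the "latest" split alias points to
print(latest)
```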
## Latest results
These are the [latest results from run 2023-07-19T13:32:08.084027](https://huggingface.co/datasets/open-llm-leaderboard/details_roneneldan__TinyStories-28M/blob/main/results_2023-07-19T13%3A32%3A08.084027.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23484320425075647,
"acc_stderr": 0.0307939466207197,
"acc_norm": 0.2355754587984318,
"acc_norm_stderr": 0.03080909034206086,
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023498,
"mc2": 0.4808347610689909,
"mc2_stderr": 0.01657815321723807
},
"harness|arc:challenge|25": {
"acc": 0.18600682593856654,
"acc_stderr": 0.011370940183266738,
"acc_norm": 0.22781569965870307,
"acc_norm_stderr": 0.012256708602326905
},
"harness|hellaswag|10": {
"acc": 0.25692093208524197,
"acc_stderr": 0.004360424536145123,
"acc_norm": 0.2583150766779526,
"acc_norm_stderr": 0.004368135676213557
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.12,
"acc_stderr": 0.032659863237109066,
"acc_norm": 0.12,
"acc_norm_stderr": 0.032659863237109066
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.038201699145179055,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.038201699145179055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21710526315789475,
"acc_stderr": 0.033550453048829226,
"acc_norm": 0.21710526315789475,
"acc_norm_stderr": 0.033550453048829226
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24150943396226415,
"acc_stderr": 0.026341480371118355,
"acc_norm": 0.24150943396226415,
"acc_norm_stderr": 0.026341480371118355
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2152777777777778,
"acc_stderr": 0.034370793441061344,
"acc_norm": 0.2152777777777778,
"acc_norm_stderr": 0.034370793441061344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.15,
"acc_stderr": 0.03588702812826372,
"acc_norm": 0.15,
"acc_norm_stderr": 0.03588702812826372
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641145,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641145
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179327,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179327
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.02880998985410297,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.02880998985410297
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.04227054451232199,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.04227054451232199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.03600105692727773,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.03600105692727773
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.02113285918275444,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.02113285918275444
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523812,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523812
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24193548387096775,
"acc_stderr": 0.024362599693031086,
"acc_norm": 0.24193548387096775,
"acc_norm_stderr": 0.024362599693031086
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.18226600985221675,
"acc_stderr": 0.02716334085964515,
"acc_norm": 0.18226600985221675,
"acc_norm_stderr": 0.02716334085964515
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.29292929292929293,
"acc_stderr": 0.032424979581788166,
"acc_norm": 0.29292929292929293,
"acc_norm_stderr": 0.032424979581788166
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23834196891191708,
"acc_stderr": 0.030748905363909906,
"acc_norm": 0.23834196891191708,
"acc_norm_stderr": 0.030748905363909906
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.02075242372212801,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.02075242372212801
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073835,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073835
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24369747899159663,
"acc_stderr": 0.02788682807838056,
"acc_norm": 0.24369747899159663,
"acc_norm_stderr": 0.02788682807838056
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.03374235550425694,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.03374235550425694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.25871559633027524,
"acc_stderr": 0.01877605231961962,
"acc_norm": 0.25871559633027524,
"acc_norm_stderr": 0.01877605231961962
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.16203703703703703,
"acc_stderr": 0.02513045365226846,
"acc_norm": 0.16203703703703703,
"acc_norm_stderr": 0.02513045365226846
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501947,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501947
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598014,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598014
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2556053811659193,
"acc_stderr": 0.029275891003969927,
"acc_norm": 0.2556053811659193,
"acc_norm_stderr": 0.029275891003969927
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.0384487613978527,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.0384487613978527
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.18404907975460122,
"acc_stderr": 0.03044677768797174,
"acc_norm": 0.18404907975460122,
"acc_norm_stderr": 0.03044677768797174
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.027236013946196694,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.027236013946196694
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.21676300578034682,
"acc_stderr": 0.02218347766841285,
"acc_norm": 0.21676300578034682,
"acc_norm_stderr": 0.02218347766841285
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21895424836601307,
"acc_stderr": 0.02367908986180772,
"acc_norm": 0.21895424836601307,
"acc_norm_stderr": 0.02367908986180772
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2347266881028939,
"acc_stderr": 0.024071805887677048,
"acc_norm": 0.2347266881028939,
"acc_norm_stderr": 0.024071805887677048
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2127659574468085,
"acc_stderr": 0.024414612974307717,
"acc_norm": 0.2127659574468085,
"acc_norm_stderr": 0.024414612974307717
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24511082138200782,
"acc_stderr": 0.010986307870045496,
"acc_norm": 0.24511082138200782,
"acc_norm_stderr": 0.010986307870045496
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.017630827375148383,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.017630827375148383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.20408163265306123,
"acc_stderr": 0.025801283475090506,
"acc_norm": 0.20408163265306123,
"acc_norm_stderr": 0.025801283475090506
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21890547263681592,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.21890547263681592,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-virology|5": {
"acc": 0.26506024096385544,
"acc_stderr": 0.03436024037944967,
"acc_norm": 0.26506024096385544,
"acc_norm_stderr": 0.03436024037944967
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023498,
"mc2": 0.4808347610689909,
"mc2_stderr": 0.01657815321723807
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-high_school_us_history-neg-prepend-fix | 2023-08-21T07:37:22.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 28257
num_examples: 5
- name: test
num_bytes: 1258154
num_examples: 204
download_size: 55081
dataset_size: 1286411
---
# Dataset Card for "mmlu-high_school_us_history-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_openbmb__UltraLM-65b | 2023-09-23T05:14:33.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of openbmb/UltraLM-65b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openbmb/UltraLM-65b](https://huggingface.co/openbmb/UltraLM-65b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openbmb__UltraLM-65b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T05:14:21.286059](https://huggingface.co/datasets/open-llm-leaderboard/details_openbmb__UltraLM-65b/blob/main/results_2023-09-23T05-14-21.286059.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.23804530201342283,\n\
\ \"em_stderr\": 0.004361481495925771,\n \"f1\": 0.2999853187919465,\n\
\ \"f1_stderr\": 0.004304795126990332,\n \"acc\": 0.5694431396390439,\n\
\ \"acc_stderr\": 0.011961137264223144\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.23804530201342283,\n \"em_stderr\": 0.004361481495925771,\n\
\ \"f1\": 0.2999853187919465,\n \"f1_stderr\": 0.004304795126990332\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.32752084912812734,\n \
\ \"acc_stderr\": 0.012927102210426474\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8113654301499605,\n \"acc_stderr\": 0.010995172318019811\n\
\ }\n}\n```"
repo_url: https://huggingface.co/openbmb/UltraLM-65b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|arc:challenge|25_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_18T23_27_44.207127
path:
- '**/details_harness|drop|3_2023-09-18T23-27-44.207127.parquet'
- split: 2023_09_23T05_14_21.286059
path:
- '**/details_harness|drop|3_2023-09-23T05-14-21.286059.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T05-14-21.286059.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_18T23_27_44.207127
path:
- '**/details_harness|gsm8k|5_2023-09-18T23-27-44.207127.parquet'
- split: 2023_09_23T05_14_21.286059
path:
- '**/details_harness|gsm8k|5_2023-09-23T05-14-21.286059.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T05-14-21.286059.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hellaswag|10_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-04T22:09:07.792369.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-04T22:09:07.792369.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-04T22:09:07.792369.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_18T23_27_44.207127
path:
- '**/details_harness|winogrande|5_2023-09-18T23-27-44.207127.parquet'
- split: 2023_09_23T05_14_21.286059
path:
- '**/details_harness|winogrande|5_2023-09-23T05-14-21.286059.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T05-14-21.286059.parquet'
- config_name: results
data_files:
- split: 2023_08_04T22_09_07.792369
path:
- results_2023-08-04T22:09:07.792369.parquet
- split: 2023_09_18T23_27_44.207127
path:
- results_2023-09-18T23-27-44.207127.parquet
- split: 2023_09_23T05_14_21.286059
path:
- results_2023-09-23T05-14-21.286059.parquet
- split: latest
path:
- results_2023-09-23T05-14-21.286059.parquet
---
# Dataset Card for Evaluation run of openbmb/UltraLM-65b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openbmb/UltraLM-65b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [openbmb/UltraLM-65b](https://huggingface.co/openbmb/UltraLM-65b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openbmb__UltraLM-65b",
"harness_winogrande_5",
split="train")
```
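Each evaluation run is stored as a split named after its timestamp, and the "latest" split mirrors the most recent of these. As a rough sketch (the split names below are copied from this card's configs), the run that "latest" resolves to can be found by sorting the timestamp-style names:

```python
# Timestamp-named splits taken from this dataset's configs; because they all
# share the same zero-padded format, lexicographic order matches
# chronological order.
splits = [
    "2023_08_04T22_09_07.792369",
    "2023_09_18T23_27_44.207127",
    "2023_09_23T05_14_21.286059",
]

# The most recent run, i.e. the one the "latest" split points to.
latest = max(splits)
print(latest)  # 2023_09_23T05_14_21.286059
```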
## Latest results
These are the [latest results from run 2023-09-23T05:14:21.286059](https://huggingface.co/datasets/open-llm-leaderboard/details_openbmb__UltraLM-65b/blob/main/results_2023-09-23T05-14-21.286059.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.23804530201342283,
"em_stderr": 0.004361481495925771,
"f1": 0.2999853187919465,
"f1_stderr": 0.004304795126990332,
"acc": 0.5694431396390439,
"acc_stderr": 0.011961137264223144
},
"harness|drop|3": {
"em": 0.23804530201342283,
"em_stderr": 0.004361481495925771,
"f1": 0.2999853187919465,
"f1_stderr": 0.004304795126990332
},
"harness|gsm8k|5": {
"acc": 0.32752084912812734,
"acc_stderr": 0.012927102210426474
},
"harness|winogrande|5": {
"acc": 0.8113654301499605,
"acc_stderr": 0.010995172318019811
}
}
```
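The aggregated results above are plain nested dictionaries, so individual metrics can be pulled out directly once loaded. A minimal sketch, using values copied from the results shown above:

```python
# Subset of the aggregated results above, as a plain Python dict.
results = {
    "all": {"acc": 0.5694431396390439, "acc_stderr": 0.011961137264223144},
    "harness|gsm8k|5": {"acc": 0.32752084912812734},
    "harness|winogrande|5": {"acc": 0.8113654301499605},
}

# Per-task accuracies, keyed by harness task name (skipping the "all" rollup).
task_accs = {task: m["acc"] for task, m in results.items() if task != "all"}
print(task_accs["harness|winogrande|5"])  # 0.8113654301499605
```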
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_xhyi__PT_GPTNEO350_ATG | 2023-09-16T19:50:25.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of xhyi/PT_GPTNEO350_ATG
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xhyi/PT_GPTNEO350_ATG](https://huggingface.co/xhyi/PT_GPTNEO350_ATG) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xhyi__PT_GPTNEO350_ATG\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-16T19:50:14.065023](https://huggingface.co/datasets/open-llm-leaderboard/details_xhyi__PT_GPTNEO350_ATG/blob/main/results_2023-09-16T19-50-14.065023.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0005243288590604027,\n\
\ \"em_stderr\": 0.0002344378046483565,\n \"f1\": 0.036350671140939664,\n\
\ \"f1_stderr\": 0.001029772885671985,\n \"acc\": 0.259575160680552,\n\
\ \"acc_stderr\": 0.007950023713639726\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0005243288590604027,\n \"em_stderr\": 0.0002344378046483565,\n\
\ \"f1\": 0.036350671140939664,\n \"f1_stderr\": 0.001029772885671985\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.004548900682335102,\n \
\ \"acc_stderr\": 0.0018535550440036202\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5146014206787688,\n \"acc_stderr\": 0.014046492383275832\n\
\ }\n}\n```"
repo_url: https://huggingface.co/xhyi/PT_GPTNEO350_ATG
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|arc:challenge|25_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_16T19_50_14.065023
path:
- '**/details_harness|drop|3_2023-09-16T19-50-14.065023.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-16T19-50-14.065023.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_16T19_50_14.065023
path:
- '**/details_harness|gsm8k|5_2023-09-16T19-50-14.065023.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-16T19-50-14.065023.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hellaswag|10_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T11:43:22.024559.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T11:43:22.024559.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T11:43:22.024559.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_16T19_50_14.065023
path:
- '**/details_harness|winogrande|5_2023-09-16T19-50-14.065023.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-16T19-50-14.065023.parquet'
- config_name: results
data_files:
- split: 2023_07_19T11_43_22.024559
path:
- results_2023-07-19T11:43:22.024559.parquet
- split: 2023_09_16T19_50_14.065023
path:
- results_2023-09-16T19-50-14.065023.parquet
- split: latest
path:
- results_2023-09-16T19-50-14.065023.parquet
---
# Dataset Card for Evaluation run of xhyi/PT_GPTNEO350_ATG
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/xhyi/PT_GPTNEO350_ATG
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [xhyi/PT_GPTNEO350_ATG](https://huggingface.co/xhyi/PT_GPTNEO350_ATG) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xhyi__PT_GPTNEO350_ATG",
"harness_winogrande_5",
split="train")
```
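Splits are named after the run timestamp, with `-` and `:` replaced by `_` (compare the run timestamp `2023-09-16T19:50:14.065023` with the split name `2023_09_16T19_50_14.065023` in the configs above). A minimal helper to derive a split name from a timestamp — a sketch inferred from that naming, not an official utility:

```python
def split_name_from_timestamp(ts: str) -> str:
    """Map a run timestamp to the split name used in this repo.

    Inferred from the configs above: '-' and ':' become '_',
    while the fractional-seconds dot is kept.
    """
    return ts.replace("-", "_").replace(":", "_")


print(split_name_from_timestamp("2023-09-16T19:50:14.065023"))
# 2023_09_16T19_50_14.065023
```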
## Latest results
These are the [latest results from run 2023-09-16T19:50:14.065023](https://huggingface.co/datasets/open-llm-leaderboard/details_xhyi__PT_GPTNEO350_ATG/blob/main/results_2023-09-16T19-50-14.065023.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0005243288590604027,
"em_stderr": 0.0002344378046483565,
"f1": 0.036350671140939664,
"f1_stderr": 0.001029772885671985,
"acc": 0.259575160680552,
"acc_stderr": 0.007950023713639726
},
"harness|drop|3": {
"em": 0.0005243288590604027,
"em_stderr": 0.0002344378046483565,
"f1": 0.036350671140939664,
"f1_stderr": 0.001029772885671985
},
"harness|gsm8k|5": {
"acc": 0.004548900682335102,
"acc_stderr": 0.0018535550440036202
},
"harness|winogrande|5": {
"acc": 0.5146014206787688,
"acc_stderr": 0.014046492383275832
}
}
```
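The nested payload above can be flattened into per-task metrics with a few lines of standard-library Python. Here `metrics_json` is a hypothetical variable holding a subset of the dict shown above, pasted as a JSON string:

```python
import json

# A subset of the aggregated results shown above, as a JSON string.
metrics_json = """
{
  "all": {"em": 0.0005243288590604027, "f1": 0.036350671140939664,
          "acc": 0.259575160680552},
  "harness|gsm8k|5": {"acc": 0.004548900682335102},
  "harness|winogrande|5": {"acc": 0.5146014206787688}
}
"""

results = json.loads(metrics_json)

# Collect accuracy per individual task, skipping the "all" aggregate.
per_task_acc = {task: m["acc"]
                for task, m in results.items()
                if task != "all" and "acc" in m}
print(per_task_acc)
```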
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-high_school_world_history-neg-prepend-fix | 2023-08-21T07:37:34.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 17052
num_examples: 5
- name: test
num_bytes: 1694250
num_examples: 237
download_size: 31580
dataset_size: 1711302
---
# Dataset Card for "mmlu-high_school_world_history-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-human_aging-neg-prepend-fix | 2023-08-21T07:37:46.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 5130
num_examples: 5
- name: test
num_bytes: 444054
num_examples: 223
download_size: 12237
dataset_size: 449184
---
# Dataset Card for "mmlu-human_aging-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-human_sexuality-neg-prepend-fix | 2023-08-21T07:37:58.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 5197
num_examples: 5
- name: test
num_bytes: 288820
num_examples: 131
download_size: 13461
dataset_size: 294017
---
# Dataset Card for "mmlu-human_sexuality-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-international_law-neg-prepend-fix | 2023-08-21T07:38:11.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 7660
num_examples: 5
- name: test
num_bytes: 467360
num_examples: 121
download_size: 15537
dataset_size: 475020
---
# Dataset Card for "mmlu-international_law-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-jurisprudence-neg-prepend-fix | 2023-08-21T07:38:23.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 5970
num_examples: 5
- name: test
num_bytes: 297404
num_examples: 108
download_size: 12787
dataset_size: 303374
---
# Dataset Card for "mmlu-jurisprudence-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-logical_fallacies-neg-prepend-fix | 2023-08-21T07:38:36.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 6380
num_examples: 5
- name: test
num_bytes: 460595
num_examples: 163
download_size: 13153
dataset_size: 466975
---
# Dataset Card for "mmlu-logical_fallacies-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-machine_learning-neg-prepend-fix | 2023-08-21T07:38:47.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 8687
num_examples: 5
- name: test
num_bytes: 385570
num_examples: 112
download_size: 18663
dataset_size: 394257
---
# Dataset Card for "mmlu-machine_learning-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-management-neg-prepend-fix | 2023-08-21T07:38:59.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 4679
num_examples: 5
- name: test
num_bytes: 189715
num_examples: 103
download_size: 11704
dataset_size: 194394
---
# Dataset Card for "mmlu-management-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-marketing-neg-prepend-fix | 2023-08-21T07:39:12.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 6193
num_examples: 5
- name: test
num_bytes: 632204
num_examples: 234
download_size: 14819
dataset_size: 638397
---
# Dataset Card for "mmlu-marketing-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-medical_genetics-neg-prepend-fix | 2023-08-21T07:39:24.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 4785
num_examples: 5
- name: test
num_bytes: 208171
num_examples: 100
download_size: 11706
dataset_size: 212956
---
# Dataset Card for "mmlu-medical_genetics-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-miscellaneous-neg-prepend-fix | 2023-08-21T07:39:36.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 4153
num_examples: 5
- name: test
num_bytes: 1302583
num_examples: 783
download_size: 10773
dataset_size: 1306736
---
# Dataset Card for "mmlu-miscellaneous-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-moral_disputes-neg-prepend-fix | 2023-08-21T07:39:49.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 6648
num_examples: 5
- name: test
num_bytes: 1035772
num_examples: 346
download_size: 12931
dataset_size: 1042420
---
# Dataset Card for "mmlu-moral_disputes-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-moral_scenarios-neg-prepend-fix | 2023-08-21T07:40:01.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 9395
num_examples: 5
- name: test
num_bytes: 3529743
num_examples: 895
download_size: 18412
dataset_size: 3539138
---
# Dataset Card for "mmlu-moral_scenarios-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-nutrition-neg-prepend-fix | 2023-08-21T07:40:14.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 7518
num_examples: 5
- name: test
num_bytes: 1003777
num_examples: 306
download_size: 16915
dataset_size: 1011295
---
# Dataset Card for "mmlu-nutrition-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-philosophy-neg-prepend-fix | 2023-08-21T07:40:28.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 4981
num_examples: 5
- name: test
num_bytes: 647230
num_examples: 311
download_size: 12766
dataset_size: 652211
---
# Dataset Card for "mmlu-philosophy-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-prehistory-neg-prepend-fix | 2023-08-21T07:40:40.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 6833
num_examples: 5
- name: test
num_bytes: 1008140
num_examples: 324
download_size: 14881
dataset_size: 1014973
---
# Dataset Card for "mmlu-prehistory-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-professional_accounting-neg-prepend-fix | 2023-08-21T07:40:52.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 9137
num_examples: 5
- name: test
num_bytes: 837793
num_examples: 282
download_size: 16120
dataset_size: 846930
---
# Dataset Card for "mmlu-professional_accounting-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-professional_law-neg-prepend-fix | 2023-08-21T07:41:04.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 21027
num_examples: 5
- name: test
num_bytes: 11054114
num_examples: 1534
download_size: 49326
dataset_size: 11075141
---
# Dataset Card for "mmlu-professional_law-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
uscobraxgummies/CobraXGummies | 2023-08-18T12:09:46.000Z | [
"region:us"
] | uscobraxgummies | null | null | null | 0 | 0 | [CobraX Gummies](https://usa-cobrax-gummies.jimdosite.com/) are a revolutionary male health supplement that has helped more than thousands of users to get better and enhanced male health. Manufactured using the best ingredients and the most potent ingredients that have been scientifically proven to show positive results on male health, these CBD gummies are one of the most reliable options.
No matter what aspect of male health you are struggling with, whether it is infidelity, inferiority complexes, or dissatisfaction, [CobraX Gummies](https://devfolio.co/@uscobraxgummies) have got you covered. With its potent blend based on advanced cutting-edge science, the supplement helps enhance not only the functioning of your reproductive organs but also your overall physical performance.
As per many of the **[CobraX Gummies](https://www.reliances.store/cobrax-male-enhancement-gummies/)** reviews, the supplement has helped them bring their life back on track and also develop a bold personality. You can also be one of them by using these gummies and bringing back the joy of being satisfied and enough in your lives.
### [_**CLICK HERE TO BUY - “CobraX Male Enhancement Gummies (United States)”**_](https://www.glitco.com/get-cobrax)
**Product Name:**
[CobraX Gummies](https://medium.com/@psdatsydsapicer/product-name-cobrax-male-enhancement-gummies-de5d52d13f2f?postPublishedType=initial).
**Category:**
A dietary supplement.
**Product Description:**
CobraX Gummies are a popular supplement for enhancing male health and performance.
[**Official Website:**](https://www.glitco.com/get-cobrax)
* 100% natural formula.
* Non-GMO.
* Free from allergens.
* Free from chemicals.
* Free from artificial sweeteners.
* Free from preservatives.
* Paraben free.
* Gluten-free.
* Cruelty-free.
* Manufactured in an FDA-registered facility.
* Made in the USA.
**Core Ingredients:**
Tongkat Ali, L-Arginine, Saw Palmetto, Tribulus Terrestris, etc.
**Key Benefits:**
* Supports enhanced male health.
* Improves physical performance.
* Enhances energy levels.
* Boosts fertility rate.
* Increases physical stamina.
**Side Effects:**
No negative triggers. (Check out the reviews!)
**CobraX Gummies Reviews:**
Positive
**Bonus Products:**
NA.
**Shipping Charges:**
Free shipping.
**Money-Back Guarantee:**
30-day money-back guarantee.
Who Has Curated The Male Health Formula Of [CobraX Gummies](https://usa-cobrax-gummies.company.site/)?
------------------------------------------------------------------------------------------------------
The wonderful male health formula of these gummies has been developed by professionals with the goal of helping every struggling man out there. Coming from a well-reputed male health brand, this male health supplement is highly-qualitative and will help you to achieve your best without compromising on any other factor.
[CobraX Gummies](https://hackmd.io/@usacobraxgummies/CobraXGummies) have been manufactured in a GMP-certified facility and follow all the necessary purity and safety standards that are necessary. As compared to the industry standards, these gummies have much more and have been clinically proven to help you through all the means that are not available with any other male health supplements.
The most important fact about CobraX Gummies is that it is entirely vegan and non-GMO. Hence, suitable for everyone to consume without any hesitancy.
[Get started with CobraX today!](https://usa-cobrax-male-enhancement-gummies.webflow.io/)
How Do [CobraX ME Gummies](https://www.scoop.it/topic/cobrax-gummies-by-usa-cobrax-gummies?curate=true&onb=1&loader=1) Work To Improve Male Performance?
--------------------------------------------------------------------------------------------------------------------------------------------------------
CobraX Gummies works by bringing all the body factors into perfect alignment and by supplementing your body cells with the necessary super nutrients for the same. As soon as you begin with your intake of the gummies, there will be increased blood flow into your corpora cavernosa. This will ultimately supply more blood to your reproductive organs and will provide you with a better physical performance rate.
The gummies also help in boosting cell regeneration naturally to expand the size of your corpora cavernosa. Through its antioxidant formula, it helps to stimulate your body to produce new cells more quickly, which ultimately helps in the formation of new tissues. With this, you are able to experience better reproductive health and functions.
Other than these, the gummies also bring a perfect hormonal balance into your body and help in boosting the levels of testosterone. Through this, it helps to improve overall physical health and performance naturally.
A Review Of The Science Behind [CobraX Gummies](https://www.youtube.com/watch?v=s4QSwytn2gE)
--------------------------------------------------------------------------------------------
A study analyzed and compared the effects on male reproductive function between the saw palmetto group and the placebo group. The findings revealed a significant increase in both total testosterone and free testosterone levels in the group that received saw palmetto extract compared to the placebo group.
Specifically, the saw palmetto group showed an average increase of 15% in total testosterone levels and a 20% increase in free testosterone levels. These results were statistically significant and provided strong evidence to support the effectiveness of saw palmetto in promoting testosterone levels.
Another study conducted in 2018 aimed to investigate the effects of L-arginine supplementation on male function and overall health. The research team recruited 100 male participants between the ages of 30 and 50 who had reported symptoms of ED and general fatigue.
The experimental group consuming l-arginine showed a significant improvement in male function compared to the control group, with an increase in the IIEF score from 12.5 to 22.3.
Furthermore, the study also revealed positive effects on overall energy levels and vitality. Participants in the experimental group experienced a notable increase in energy compared to those in the control group.
This was supported by statistical analysis, demonstrating a significant improvement in the FSS score (p < 0.05).
[Click here to check out the official website for CobraX >>>](https://www.glitco.com/get-cobrax)
What Are The Multiple Male Health Benefits Of Consuming [CobraX Gummies](https://infogram.com/cobrax-male-enhancement-gummies-reviewed-by-sexologist-doctors-usa-1hzj4o3dj0y334p?live)?
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Packed with supernutrients and significant vitamins and plant extracts, CobraX Gummies are one of the most beneficial options in the male health industry. It provides you with the following major male health benefits.
### [CobraX Gummies](https://usacobraxgummies.contently.com/) Supports The Functioning Of Your Reproductive Organs
The primary benefit of CobraX Gummies is to improve the health and functioning of your reproductive organs. Through its potent blend of ingredients that helps in increasing the blood flow toward your reproductive organs and supplying them with all the necessary nutrients, the gummies ensure that you are able to experience enhanced reproductive health.
By boosting the cell regeneration process, the supplement provides you with enhanced reproductive organs and functioning as the development of new tissues means more quality and satisfaction. This way, you are able to be long-lasting and consistent with your performance naturally.
### [CobraX Gummies](https://www.linkedin.com/events/cobraxmaleenhancementgummies7098191833456926721/comments/) Helps In Increase Your Fertility Rate
The formula of CobraX Gummies has been enriched with ingredients that are rich in antioxidants. The daily intake of the gummies provides you with better testosterone levels and other hormone balances essential for a better fertility rate.
This way, your fertility rate is boosted without you having to engage in any kind of chemical methods or painful surgeries. The supplement helps in boosting the level of fluids in your reproductive organs and the health of your gamete.
There are a number of CobraX Gummies reviews that have appreciated the supplement for helping them conceive after being childless for decades. You can also be one of them by boosting your fertility rate to the next level with these gummies.
### [CobraX Gummies](https://soundcloud.com/cobraxgummies-554769834/cobrax-male-enhancement-gummies-reviewed-by-sexologist-doctors-usa?) Helps To Improve Your Physical Performance
Another major benefit of CobraX Gummies is that it enhances and improves your physical performance rate. With its daily use, you will be able to see an enhanced physical drive in your body that will help you in performing better as you did in your 20s.
With enhanced testosterone levels and optimal blood flow to your reproductive organs, the gummies help in enhancing your physical stamina, which ultimately helps you to stay put for longer durations and also achieve maximum satisfaction.
### CobraX Gummies Boosts Your Physical Drive And Energy Levels
CobraX Gummies are rich in elements that have been clinically proven to boost your energy levels naturally. As you age, it is natural to lose energy, drive, and stamina, but this is not inevitable. With the recommended consumption of the gummies, you will be able to bring all three back into your body naturally.
Hence, you will be the same as you were in your 20s and have better control over your releases.
[Get your hands on CobraX and experience the benefits now!](https://www.glitco.com/get-cobrax)
What Is In The CobraX Gummies – A Look At The Ingredients Label
---------------------------------------------------------------
Below we shall look into the ingredients that constitute this delectable male health gummy supplement:
### Tribulus Terrestris
Tribulus terrestris grows across much of the world and is commonly found in areas like Asia, Europe, etc. Tribulus terrestris appears low to the ground with highly branched stems.
Tribulus terrestris produces active compounds called saponins, which are molecules with an affinity for interacting with hormones like testosterone. These compounds help promote overall male energy and vitality by inhibiting enzymes responsible for breaking down hormones, such as luteinizing hormone-releasing hormone (LHRH) and prolactin.
This increase in hormonal activity can lead to important biological effects including increased drive and prowess in men along with other reproductive benefits such as improved emission quality.
### Saw Palmetto
Native to the warm climates of the South, saw palmetto is commonly found along the coastal regions of Florida and other southeastern states. Historically, Native Americans have harnessed the power of this plant for its medicinal properties.
The berries of the saw palmetto plant contain a variety of bioactive compounds, including fatty acids, phytosterols, and flavonoids, which are believed to contribute to its therapeutic effects.
It helps to inhibit the enzyme 5-alpha-reductase, which is responsible for converting testosterone into dihydrotestosterone (DHT). By reducing the conversion of testosterone to DHT, saw palmetto may help maintain optimal levels of testosterone in the body.
This hormonal balance is crucial for maintaining prostate health, as imbalances can lead to conditions such as benign prostatic hyperplasia (BPH).
Furthermore, saw palmetto may possess antiandrogenic properties, meaning it can block the effects of androgens in the body. This mechanism potentially plays a role in managing conditions such as androgenic alopecia, commonly known as male pattern baldness. By inhibiting the binding of DHT to hair follicles, saw palmetto can help prevent hair loss and promote healthier hair growth.
[Order now before stock runs out – click here!](https://sway.office.com/6nn2xPkgFLsbLe0S)
### L-Arginine
One primary reason why L-Arginine is included in CobraX Gummies is its role in promoting vasodilation, which refers to the widening of blood vessels. This is achieved through the conversion of L-Arginine into nitric oxide (NO), a potent signaling molecule that helps relax and expand blood vessels.
By facilitating vasodilation, L-Arginine promotes improved blood flow throughout the body, including the genital area. This enhanced circulation is essential for supporting male performance, as it ensures a sufficient supply of oxygen and nutrients to the muscles and organs involved.
Additionally, L-Arginine stimulates the production of growth hormone and insulin-like growth factor 1 (IGF-1), both of which are crucial for maintaining muscle mass, supporting energy levels, and promoting overall physical performance.
Furthermore, L-Arginine plays a pivotal role in the urea cycle, a series of biochemical reactions responsible for removing toxic ammonia from the body. By aiding in the elimination of ammonia, L-Arginine helps maintain optimal metabolic functioning, which directly influences energy levels and overall performance.
### Eurycoma Longifolia
Eurycoma Longifolia, also known as Malaysian Ginseng or Tongkat Ali, is a herbal plant commonly found in Southeast Asia, particularly in Malaysia, Indonesia, Thailand, and Vietnam.
One of the key mechanisms of Eurycoma Longifolia is its ability to stimulate the production of luteinizing hormone (LH). LH plays a crucial role in the regulation of testosterone production in the male body. By increasing LH levels, Eurycoma Longifolia indirectly promotes the production of testosterone, which is essential for male health and vitality.
Furthermore, Eurycoma Longifolia has been found to inhibit the activity of SHBG – a protein that binds to testosterone and renders it inactive. By inhibiting SHBG, Eurycoma Longifolia allows more free testosterone to circulate in the body, thereby maximizing its positive effects on male health and stamina.
[Click here to claim your discount!](https://www.glitco.com/get-cobrax)
How Should You Use [CobraX Gummies](https://www.dibiz.com/usacobraxgummies)?
----------------------------------------------------------------------------
As per the official website of CobraX Gummies, you should consume two gummies every day. With regular consumption for two weeks, you will be able to experience better reproductive organ health, and your physical performance will enhance sharply from the third or fourth week of intake.
Starting from the fourth week, you will be able to last for prolonged hours without feeling drained. In this way, the gummies will help you in reaching your reproductive equilibrium in a short duration of time naturally.
In addition to the intake, you are also advised to follow a healthy diet and stick to a workout routine that will help you in gaining concrete ground when it comes to your physical and reproductive wellness.
### Can The [CobraX Gummies](https://sites.google.com/view/usa-cobrax-gummies/home) Impact Your Health Negatively?
No. CobraX Gummies have no history of affecting any of their users negatively. Prepared using the best natural ingredients at maximum potency, the gummies have all the efficiency needed to suit your needs naturally without compromising on any other factor.
Also, as mentioned earlier, these Male Enhancement Gummies come from a reputed brand that produces all its supplements in an FDA-registered facility. Therefore, the gummies stick to all the necessary purity standards and are entirely vegan in nature. You can use them freely and for a long duration without worrying about any side effects.
However, if you have a pre-existing medical condition, it is a good idea to consult a doctor before you begin taking the gummies.
How Can You Get A Refund On [CobraX Gummies](https://groups.google.com/g/usa-cobrax-male-enhancement-gummies/c/z_Z-uKTnAW4)?
----------------------------------------------------------------------------------------------------------------------------
You can easily get a refund on your purchase of [CobraX Gummies](https://lookerstudio.google.com/reporting/c4679de7-bf0a-459a-8195-6376d22951c3/page/f2AaD) by going through some of the simple steps. Every purchase of the gummies is secured with a money-back guarantee of 30 days.
If you feel dissatisfied or if the results are not up to the mark, then all you need to do is to contact the company and inform them about your refund request. There might be a few steps, but there will be no extra questions asked.
Concluding Thoughts On [CobraX Gummies](https://colab.research.google.com/drive/1Qn1iElwm5bskQcaMYdKT5q3Q7zmjgsws#scrollTo=Q_PjKIURJ_1b) – Is It Worth The Hype?
----------------------------------------------------------------------------------------------------------------------------------------------------------------
[CobraX Gummies](https://buy-cobrax-male-enhancement-gummies.blogspot.com/2023/08/cobrax-male-enhancement-gummies.html) are worth every hype they have been receiving on the internet from real people. As per many of the CobraX Gummies reviews, the supplement has helped them overcome major barriers in their relationship with their partner and has successfully transformed their lives into something far better.
**Product Name** - [CobraX **Male Enhancement Gummies**](https://sites.google.com/view/usa-cobrax-gummies/home)
**Benefits** - Regain Natural Energy, Stamina, & Sex Drive, Get Harder, Longer Lasting Erections
**Customer Reviews** - ★★★★✰ 4.9/5
**Official Website** - [https://www.glitco.com/get-cobrax](https://www.glitco.com/get-cobrax)
With the use of these natural and potent gummies, you will be able to take your physical game to the next level and satisfy both yourself and your partner to the maximum. There are no hidden charges or side effects associated with the supplement, and it is entirely reliable for your daily use.
[\[BEST PRICE\] Get CobraX for the lowest price ever!](https://www.glitco.com/get-cobrax) |
joey234/mmlu-professional_medicine-neg-prepend-fix | 2023-08-21T07:41:16.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 13678
num_examples: 5
- name: test
num_bytes: 1083116
num_examples: 272
download_size: 26475
dataset_size: 1096794
---
# Dataset Card for "mmlu-professional_medicine-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-professional_psychology-neg-prepend-fix | 2023-08-21T07:41:29.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 7999
num_examples: 5
- name: test
num_bytes: 2096464
num_examples: 612
download_size: 14733
dataset_size: 2104463
---
# Dataset Card for "mmlu-professional_psychology-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-public_relations-neg-prepend-fix | 2023-08-21T07:41:39.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 6140
num_examples: 5
- name: test
num_bytes: 293992
num_examples: 110
download_size: 13927
dataset_size: 300132
---
# Dataset Card for "mmlu-public_relations-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-security_studies-neg-prepend-fix | 2023-08-21T07:41:52.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 13696
num_examples: 5
- name: test
num_bytes: 1861347
num_examples: 245
download_size: 22717
dataset_size: 1875043
---
# Dataset Card for "mmlu-security_studies-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-sociology-neg-prepend-fix | 2023-08-21T07:42:03.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 6037
num_examples: 5
- name: test
num_bytes: 570178
num_examples: 201
download_size: 14589
dataset_size: 576215
---
# Dataset Card for "mmlu-sociology-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-us_foreign_policy-neg-prepend-fix | 2023-08-21T07:42:16.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 6163
num_examples: 5
- name: test
num_bytes: 275921
num_examples: 100
download_size: 13599
dataset_size: 282084
---
# Dataset Card for "mmlu-us_foreign_policy-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-virology-neg-prepend-fix | 2023-08-21T07:42:28.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 5034
num_examples: 5
- name: test
num_bytes: 356177
num_examples: 166
download_size: 12357
dataset_size: 361211
---
# Dataset Card for "mmlu-virology-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-world_religions-neg-prepend-fix | 2023-08-21T07:42:40.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 4146
num_examples: 5
- name: test
num_bytes: 260517
num_examples: 171
download_size: 10872
dataset_size: 264663
---
# Dataset Card for "mmlu-world_religions-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ramadita/alpaca-id-gptq | 2023-08-18T16:21:18.000Z | [
"region:us"
] | ramadita | null | null | null | 0 | 0 | Entry not found |
Default-Box/recipe_nlg-trim | 2023-08-18T14:08:27.000Z | [
"size_categories:1M<n<10M",
"language:en",
"region:us"
] | Default-Box | null | null | null | 0 | 0 | ---
language:
- en
size_categories:
- 1M<n<10M
viewer: true
--- |
whiterstudio2252001/WhiteStudioAI | 2023-08-25T03:56:15.000Z | [
"license:openrail",
"region:us"
] | whiterstudio2252001 | null | null | null | 0 | 0 | ---
license: openrail
---
|
BensonZhang/Laion-1M-nofailure | 2023-08-18T12:39:06.000Z | [
"region:us"
] | BensonZhang | null | null | null | 0 | 0 | Entry not found |
harshiitsingh/flipkart-scraped-dresses-10 | 2023-08-18T12:47:51.000Z | [
"region:us"
] | harshiitsingh | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 102203.0
num_examples: 10
download_size: 102337
dataset_size: 102203.0
---
# Dataset Card for "flipkart-scraped-dresses-10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davanstrien/ner-test | 2023-08-18T12:56:53.000Z | [
"region:us"
] | davanstrien | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: id
dtype: string
- name: ner_tags
sequence: string
- name: tokens
sequence: string
splits:
- name: train
num_bytes: 1548186
num_examples: 5216
- name: valid
num_bytes: 392764
num_examples: 1304
download_size: 0
dataset_size: 1940950
---
# Dataset Card for "ner-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
desik98/paraphrase_identification_iiith | 2023-08-18T13:06:21.000Z | [
"region:us"
] | desik98 | null | null | null | 0 | 0 | Entry not found |
desik98/Aya | 2023-08-22T18:08:55.000Z | [
"region:us"
] | desik98 | null | null | null | 0 | 0 | Entry not found |
pawanrawat0926/langchain-sample-ds | 2023-08-18T13:12:38.000Z | [
"region:us"
] | pawanrawat0926 | null | null | null | 0 | 0 | Entry not found |
HarishDemigod/OCRdataset | 2023-08-18T13:27:40.000Z | [
"license:unlicense",
"region:us"
] | HarishDemigod | null | null | null | 0 | 0 | ---
license: unlicense
---
|
aborruso/pnrr | 2023-08-18T13:40:47.000Z | [
"license:cc-by-4.0",
"region:us"
] | aborruso | null | null | null | 0 | 0 | ---
license: cc-by-4.0
---
|
fake-news-UFG/FakeRecogna | 2023-08-18T14:45:07.000Z | [
"language_creators:found",
"multilinguality:monolingual",
"size_categories:10K<n<100K",
"language:pt",
"region:us"
] | fake-news-UFG | null | null | null | 0 | 0 | ---
language:
- pt
language_details: pt-BR
pretty_name: FakeRecogna
size_categories:
- 10K<n<100K
multilinguality:
- monolingual
language_creators:
- found
---
# FakeRecogna
## Dataset Description
- **Homepage:** [https://github.com/Gabriel-Lino-Garcia/FakeRecogna](https://github.com/Gabriel-Lino-Garcia/FakeRecogna)
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
FakeRecogna is a dataset comprised of real and fake news.
The real news items are not directly linked to the fake ones (and vice-versa), since such pairing could lead to a biased classification.
The news was collected by crawlers developed to mine the pages of well-known news agencies of great national importance.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The dataset is in Portuguese.
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
If you use "FakeRecogna Dataset", please cite:
```bibtex
@inproceedings{10.1007/978-3-030-98305-5_6,
author = {Garcia, Gabriel L. and Afonso, Luis C. S. and Papa, Jo\~{a}o P.},
title = {FakeRecogna: A New Brazilian Corpus for Fake News Detection},
year = {2022},
isbn = {978-3-030-98304-8},
publisher = {Springer-Verlag},
address = {Berlin, Heidelberg},
url = {https://doi.org/10.1007/978-3-030-98305-5_6},
doi = {10.1007/978-3-030-98305-5_6},
abstract = {Fake news has become a research topic of great importance in Natural Language Processing due to its negative impact on our society. Although its pertinence, there are few datasets available in Brazilian Portuguese and mostly comprise few samples. Therefore, this paper proposes creating a new fake news dataset named FakeRecogna that contains a greater number of samples, more up-to-date news, and covering a few of the most important categories. We perform a toy evaluation over the created dataset using traditional classifiers such as Naive Bayes, Optimum-Path Forest, and Support Vector Machines. A Convolutional Neural Network is also evaluated in the context of fake news detection in the proposed dataset.},
booktitle = {Computational Processing of the Portuguese Language: 15th International Conference, PROPOR 2022, Fortaleza, Brazil, March 21–23, 2022, Proceedings},
pages = {57–67},
numpages = {11},
keywords = {Fake news, Corpus, Portuguese},
location = {Fortaleza, Brazil}
}
```
### Contributions
Thanks to [@ju-resplande](https://github.com/ju-resplande) for adding this dataset. |
anuragdak/realestate | 2023-08-21T05:37:46.000Z | [
"language:en",
"license:apache-2.0",
"region:us"
] | anuragdak | null | null | null | 0 | 0 | ---
license: apache-2.0
language:
- en
--- |
Seenka/directv-zocalos-18-agosto-5fps | 2023-08-18T14:15:44.000Z | [
"region:us"
] | Seenka | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image
dtype: image
- name: frame_time
dtype: time64[us]
- name: video_storage_path
dtype: string
- name: zocalo_id
dtype: string
- name: frame_number
dtype: int64
splits:
- name: train
num_bytes: 3021197.0
num_examples: 25
download_size: 1857619
dataset_size: 3021197.0
---
# Dataset Card for "directv-zocalos-18-agosto-5fps"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Praneethdodedu/opposite_science | 2023-08-20T12:16:05.000Z | [
"license:openrail",
"region:us"
] | Praneethdodedu | null | null | null | 0 | 0 | ---
license: openrail
---
|
jabuticaba-br/ultron-courses | 2023-08-18T14:23:13.000Z | [
"region:us"
] | jabuticaba-br | null | null | null | 0 | 0 | Entry not found |
MythicalStats/videos | 2023-08-18T14:28:56.000Z | [
"license:openrail",
"region:us"
] | MythicalStats | null | null | null | 0 | 0 | ---
license: openrail
---
|
Seenka/directv-zocalos-agosto-5fps | 2023-08-18T14:31:14.000Z | [
"region:us"
] | Seenka | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image
dtype: image
- name: frame_time
dtype: time64[us]
- name: video_storage_path
dtype: string
- name: zocalo_id
dtype: string
- name: frame_number
dtype: int64
splits:
- name: train
num_bytes: 185441076.0
num_examples: 590
download_size: 168694210
dataset_size: 185441076.0
---
# Dataset Card for "directv-zocalos-agosto-5fps"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ZiAngGu/omni3d_v2 | 2023-08-19T02:53:16.000Z | [
"region:us"
] | ZiAngGu | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: text
dtype: string
- name: label
sequence: string
splits:
- name: train
num_bytes: 18091936016.3
num_examples: 194700
download_size: 21810993407
dataset_size: 18091936016.3
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "omni3d_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Amani123/donutdataset | 2023-08-18T15:12:10.000Z | [
"region:us"
] | Amani123 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 77291761.0
num_examples: 96
download_size: 76288174
dataset_size: 77291761.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "donutdataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Tehujay/djummate | 2023-08-19T09:42:19.000Z | [
"license:other",
"region:us"
] | Tehujay | null | null | null | 0 | 0 | ---
license: other
---
|
OneFly7/llama2-politosphere-fine-tuning | 2023-08-20T07:51:25.000Z | [
"region:us"
] | OneFly7 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label_text
dtype: string
splits:
- name: train
num_bytes: 24345
num_examples: 113
- name: validation
num_bytes: 22093
num_examples: 113
download_size: 26501
dataset_size: 46438
---
# Dataset Card for "llama2-politosphere-fine-tuning"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ThankGod/melanoma | 2023-08-18T15:52:54.000Z | [
"region:us"
] | ThankGod | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Actinic keratosis
'1': Benign keratosis
'2': Dermatofibroma
'3': Melanocytic nevus
'4': Vascular lesion
splits:
- name: train
num_bytes: 5807355933.47
num_examples: 14318
- name: validation
num_bytes: 406410771.682
num_examples: 1262
- name: test
num_bytes: 393276175.928
num_examples: 1278
download_size: 5993086085
dataset_size: 6607042881.08
---
# Dataset Card for "melanoma"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EgilKarlsen/BGL_BERT_Finetuned | 2023-08-23T06:03:08.000Z | [
"region:us"
] | EgilKarlsen | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115582709.0625
num_examples: 37500
- name: test
num_bytes: 38527570.0
num_examples: 12500
download_size: 211883038
dataset_size: 154110279.0625
---
# Dataset Card for "BGL_BERT_Finetuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EgilKarlsen/BGL_RoBERTa_Finetuned | 2023-08-23T06:10:35.000Z | [
"region:us"
] | EgilKarlsen | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115582709.0625
num_examples: 37500
- name: test
num_bytes: 38527570.0
num_examples: 12500
download_size: 211881880
dataset_size: 154110279.0625
---
# Dataset Card for "BGL_RoBERTa_Finetuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KhalfounMehdi/dermatology_anomaly_detection | 2023-08-18T16:36:55.000Z | [
"region:us"
] | KhalfounMehdi | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: string
splits:
- name: train
num_bytes: 51523748.0
num_examples: 656
download_size: 51529683
dataset_size: 51523748.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: KhalfounMehdi--dermatology_anomaly_detection
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dermatology_anomaly_detection"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Alterneko/x | 2023-08-25T04:41:14.000Z | [
"region:us"
] | Alterneko | null | null | null | 0 | 0 | Entry not found |
EgilKarlsen/BGL_DistilRoBERTa_Finetuned | 2023-08-23T06:17:21.000Z | [
"region:us"
] | EgilKarlsen | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115582709.0625
num_examples: 37500
- name: test
num_bytes: 38527570.0
num_examples: 12500
download_size: 211882718
dataset_size: 154110279.0625
---
# Dataset Card for "BGL_DistilRoBERTa_Finetuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EgilKarlsen/BGL_GPT2_Finetuned | 2023-08-23T06:24:55.000Z | [
"region:us"
] | EgilKarlsen | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115582709.0625
num_examples: 37500
- name: test
num_bytes: 38527570.0
num_examples: 12500
download_size: 211839200
dataset_size: 154110279.0625
---
# Dataset Card for "BGL_GPT2_Finetuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SergeyKarpenko1/autotrain-data-dfbv | 2023-08-18T16:28:35.000Z | [
"region:us"
] | SergeyKarpenko1 | null | null | null | 0 | 0 | Entry not found |
Wrathfulreap/Wrath001 | 2023-08-18T17:10:31.000Z | [
"license:apache-2.0",
"region:us"
] | Wrathfulreap | null | null | null | 0 | 0 | ---
license: apache-2.0
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
### Languages
English
## Dataset Structure
categorized
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
08/18/2023
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
No illegal content.
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Apache 2.0
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
EgilKarlsen/BGL_GPTNEO_Finetuned | 2023-08-23T06:56:42.000Z | [
"region:us"
] | EgilKarlsen | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: '768'
dtype: float32
- name: '769'
dtype: float32
- name: '770'
dtype: float32
- name: '771'
dtype: float32
- name: '772'
dtype: float32
- name: '773'
dtype: float32
- name: '774'
dtype: float32
- name: '775'
dtype: float32
- name: '776'
dtype: float32
- name: '777'
dtype: float32
- name: '778'
dtype: float32
- name: '779'
dtype: float32
- name: '780'
dtype: float32
- name: '781'
dtype: float32
- name: '782'
dtype: float32
- name: '783'
dtype: float32
- name: '784'
dtype: float32
- name: '785'
dtype: float32
- name: '786'
dtype: float32
- name: '787'
dtype: float32
- name: '788'
dtype: float32
- name: '789'
dtype: float32
- name: '790'
dtype: float32
- name: '791'
dtype: float32
- name: '792'
dtype: float32
- name: '793'
dtype: float32
- name: '794'
dtype: float32
- name: '795'
dtype: float32
- name: '796'
dtype: float32
- name: '797'
dtype: float32
- name: '798'
dtype: float32
- name: '799'
dtype: float32
- name: '800'
dtype: float32
- name: '801'
dtype: float32
- name: '802'
dtype: float32
- name: '803'
dtype: float32
- name: '804'
dtype: float32
- name: '805'
dtype: float32
- name: '806'
dtype: float32
- name: '807'
dtype: float32
- name: '808'
dtype: float32
- name: '809'
dtype: float32
- name: '810'
dtype: float32
- name: '811'
dtype: float32
- name: '812'
dtype: float32
- name: '813'
dtype: float32
- name: '814'
dtype: float32
- name: '815'
dtype: float32
- name: '816'
dtype: float32
- name: '817'
dtype: float32
- name: '818'
dtype: float32
- name: '819'
dtype: float32
- name: '820'
dtype: float32
- name: '821'
dtype: float32
- name: '822'
dtype: float32
- name: '823'
dtype: float32
- name: '824'
dtype: float32
- name: '825'
dtype: float32
- name: '826'
dtype: float32
- name: '827'
dtype: float32
- name: '828'
dtype: float32
- name: '829'
dtype: float32
- name: '830'
dtype: float32
- name: '831'
dtype: float32
- name: '832'
dtype: float32
- name: '833'
dtype: float32
- name: '834'
dtype: float32
- name: '835'
dtype: float32
- name: '836'
dtype: float32
- name: '837'
dtype: float32
- name: '838'
dtype: float32
- name: '839'
dtype: float32
- name: '840'
dtype: float32
- name: '841'
dtype: float32
- name: '842'
dtype: float32
- name: '843'
dtype: float32
- name: '844'
dtype: float32
- name: '845'
dtype: float32
- name: '846'
dtype: float32
- name: '847'
dtype: float32
- name: '848'
dtype: float32
- name: '849'
dtype: float32
- name: '850'
dtype: float32
- name: '851'
dtype: float32
- name: '852'
dtype: float32
- name: '853'
dtype: float32
- name: '854'
dtype: float32
- name: '855'
dtype: float32
- name: '856'
dtype: float32
- name: '857'
dtype: float32
- name: '858'
dtype: float32
- name: '859'
dtype: float32
- name: '860'
dtype: float32
- name: '861'
dtype: float32
- name: '862'
dtype: float32
- name: '863'
dtype: float32
- name: '864'
dtype: float32
- name: '865'
dtype: float32
- name: '866'
dtype: float32
- name: '867'
dtype: float32
- name: '868'
dtype: float32
- name: '869'
dtype: float32
- name: '870'
dtype: float32
- name: '871'
dtype: float32
- name: '872'
dtype: float32
- name: '873'
dtype: float32
- name: '874'
dtype: float32
- name: '875'
dtype: float32
- name: '876'
dtype: float32
- name: '877'
dtype: float32
- name: '878'
dtype: float32
- name: '879'
dtype: float32
- name: '880'
dtype: float32
- name: '881'
dtype: float32
- name: '882'
dtype: float32
- name: '883'
dtype: float32
- name: '884'
dtype: float32
- name: '885'
dtype: float32
- name: '886'
dtype: float32
- name: '887'
dtype: float32
- name: '888'
dtype: float32
- name: '889'
dtype: float32
- name: '890'
dtype: float32
- name: '891'
dtype: float32
- name: '892'
dtype: float32
- name: '893'
dtype: float32
- name: '894'
dtype: float32
- name: '895'
dtype: float32
- name: '896'
dtype: float32
- name: '897'
dtype: float32
- name: '898'
dtype: float32
- name: '899'
dtype: float32
- name: '900'
dtype: float32
- name: '901'
dtype: float32
- name: '902'
dtype: float32
- name: '903'
dtype: float32
- name: '904'
dtype: float32
- name: '905'
dtype: float32
- name: '906'
dtype: float32
- name: '907'
dtype: float32
- name: '908'
dtype: float32
- name: '909'
dtype: float32
- name: '910'
dtype: float32
- name: '911'
dtype: float32
- name: '912'
dtype: float32
- name: '913'
dtype: float32
- name: '914'
dtype: float32
- name: '915'
dtype: float32
- name: '916'
dtype: float32
- name: '917'
dtype: float32
- name: '918'
dtype: float32
- name: '919'
dtype: float32
- name: '920'
dtype: float32
- name: '921'
dtype: float32
- name: '922'
dtype: float32
- name: '923'
dtype: float32
- name: '924'
dtype: float32
- name: '925'
dtype: float32
- name: '926'
dtype: float32
- name: '927'
dtype: float32
- name: '928'
dtype: float32
- name: '929'
dtype: float32
- name: '930'
dtype: float32
- name: '931'
dtype: float32
- name: '932'
dtype: float32
- name: '933'
dtype: float32
- name: '934'
dtype: float32
- name: '935'
dtype: float32
- name: '936'
dtype: float32
- name: '937'
dtype: float32
- name: '938'
dtype: float32
- name: '939'
dtype: float32
- name: '940'
dtype: float32
- name: '941'
dtype: float32
- name: '942'
dtype: float32
- name: '943'
dtype: float32
- name: '944'
dtype: float32
- name: '945'
dtype: float32
- name: '946'
dtype: float32
- name: '947'
dtype: float32
- name: '948'
dtype: float32
- name: '949'
dtype: float32
- name: '950'
dtype: float32
- name: '951'
dtype: float32
- name: '952'
dtype: float32
- name: '953'
dtype: float32
- name: '954'
dtype: float32
- name: '955'
dtype: float32
- name: '956'
dtype: float32
- name: '957'
dtype: float32
- name: '958'
dtype: float32
- name: '959'
dtype: float32
- name: '960'
dtype: float32
- name: '961'
dtype: float32
- name: '962'
dtype: float32
- name: '963'
dtype: float32
- name: '964'
dtype: float32
- name: '965'
dtype: float32
- name: '966'
dtype: float32
- name: '967'
dtype: float32
- name: '968'
dtype: float32
- name: '969'
dtype: float32
- name: '970'
dtype: float32
- name: '971'
dtype: float32
- name: '972'
dtype: float32
- name: '973'
dtype: float32
- name: '974'
dtype: float32
- name: '975'
dtype: float32
- name: '976'
dtype: float32
- name: '977'
dtype: float32
- name: '978'
dtype: float32
- name: '979'
dtype: float32
- name: '980'
dtype: float32
- name: '981'
dtype: float32
- name: '982'
dtype: float32
- name: '983'
dtype: float32
- name: '984'
dtype: float32
- name: '985'
dtype: float32
- name: '986'
dtype: float32
- name: '987'
dtype: float32
- name: '988'
dtype: float32
- name: '989'
dtype: float32
- name: '990'
dtype: float32
- name: '991'
dtype: float32
- name: '992'
dtype: float32
- name: '993'
dtype: float32
- name: '994'
dtype: float32
- name: '995'
dtype: float32
- name: '996'
dtype: float32
- name: '997'
dtype: float32
- name: '998'
dtype: float32
- name: '999'
dtype: float32
- name: '1000'
dtype: float32
- name: '1001'
dtype: float32
- name: '1002'
dtype: float32
- name: '1003'
dtype: float32
- name: '1004'
dtype: float32
- name: '1005'
dtype: float32
- name: '1006'
dtype: float32
- name: '1007'
dtype: float32
- name: '1008'
dtype: float32
- name: '1009'
dtype: float32
- name: '1010'
dtype: float32
- name: '1011'
dtype: float32
- name: '1012'
dtype: float32
- name: '1013'
dtype: float32
- name: '1014'
dtype: float32
- name: '1015'
dtype: float32
- name: '1016'
dtype: float32
- name: '1017'
dtype: float32
- name: '1018'
dtype: float32
- name: '1019'
dtype: float32
- name: '1020'
dtype: float32
- name: '1021'
dtype: float32
- name: '1022'
dtype: float32
- name: '1023'
dtype: float32
- name: '1024'
dtype: float32
- name: '1025'
dtype: float32
- name: '1026'
dtype: float32
- name: '1027'
dtype: float32
- name: '1028'
dtype: float32
- name: '1029'
dtype: float32
- name: '1030'
dtype: float32
- name: '1031'
dtype: float32
- name: '1032'
dtype: float32
- name: '1033'
dtype: float32
- name: '1034'
dtype: float32
- name: '1035'
dtype: float32
- name: '1036'
dtype: float32
- name: '1037'
dtype: float32
- name: '1038'
dtype: float32
- name: '1039'
dtype: float32
- name: '1040'
dtype: float32
- name: '1041'
dtype: float32
- name: '1042'
dtype: float32
- name: '1043'
dtype: float32
- name: '1044'
dtype: float32
- name: '1045'
dtype: float32
- name: '1046'
dtype: float32
- name: '1047'
dtype: float32
- name: '1048'
dtype: float32
- name: '1049'
dtype: float32
- name: '1050'
dtype: float32
- name: '1051'
dtype: float32
- name: '1052'
dtype: float32
- name: '1053'
dtype: float32
- name: '1054'
dtype: float32
- name: '1055'
dtype: float32
- name: '1056'
dtype: float32
- name: '1057'
dtype: float32
- name: '1058'
dtype: float32
- name: '1059'
dtype: float32
- name: '1060'
dtype: float32
- name: '1061'
dtype: float32
- name: '1062'
dtype: float32
- name: '1063'
dtype: float32
- name: '1064'
dtype: float32
- name: '1065'
dtype: float32
- name: '1066'
dtype: float32
- name: '1067'
dtype: float32
- name: '1068'
dtype: float32
- name: '1069'
dtype: float32
- name: '1070'
dtype: float32
- name: '1071'
dtype: float32
- name: '1072'
dtype: float32
- name: '1073'
dtype: float32
- name: '1074'
dtype: float32
- name: '1075'
dtype: float32
- name: '1076'
dtype: float32
- name: '1077'
dtype: float32
- name: '1078'
dtype: float32
- name: '1079'
dtype: float32
- name: '1080'
dtype: float32
- name: '1081'
dtype: float32
- name: '1082'
dtype: float32
- name: '1083'
dtype: float32
- name: '1084'
dtype: float32
- name: '1085'
dtype: float32
- name: '1086'
dtype: float32
- name: '1087'
dtype: float32
- name: '1088'
dtype: float32
- name: '1089'
dtype: float32
- name: '1090'
dtype: float32
- name: '1091'
dtype: float32
- name: '1092'
dtype: float32
- name: '1093'
dtype: float32
- name: '1094'
dtype: float32
- name: '1095'
dtype: float32
- name: '1096'
dtype: float32
- name: '1097'
dtype: float32
- name: '1098'
dtype: float32
- name: '1099'
dtype: float32
- name: '1100'
dtype: float32
- name: '1101'
dtype: float32
- name: '1102'
dtype: float32
- name: '1103'
dtype: float32
- name: '1104'
dtype: float32
- name: '1105'
dtype: float32
- name: '1106'
dtype: float32
- name: '1107'
dtype: float32
- name: '1108'
dtype: float32
- name: '1109'
dtype: float32
- name: '1110'
dtype: float32
- name: '1111'
dtype: float32
- name: '1112'
dtype: float32
- name: '1113'
dtype: float32
- name: '1114'
dtype: float32
- name: '1115'
dtype: float32
- name: '1116'
dtype: float32
- name: '1117'
dtype: float32
- name: '1118'
dtype: float32
- name: '1119'
dtype: float32
- name: '1120'
dtype: float32
- name: '1121'
dtype: float32
- name: '1122'
dtype: float32
- name: '1123'
dtype: float32
- name: '1124'
dtype: float32
- name: '1125'
dtype: float32
- name: '1126'
dtype: float32
- name: '1127'
dtype: float32
- name: '1128'
dtype: float32
- name: '1129'
dtype: float32
- name: '1130'
dtype: float32
- name: '1131'
dtype: float32
- name: '1132'
dtype: float32
- name: '1133'
dtype: float32
- name: '1134'
dtype: float32
- name: '1135'
dtype: float32
- name: '1136'
dtype: float32
- name: '1137'
dtype: float32
- name: '1138'
dtype: float32
- name: '1139'
dtype: float32
- name: '1140'
dtype: float32
- name: '1141'
dtype: float32
- name: '1142'
dtype: float32
- name: '1143'
dtype: float32
- name: '1144'
dtype: float32
- name: '1145'
dtype: float32
- name: '1146'
dtype: float32
- name: '1147'
dtype: float32
- name: '1148'
dtype: float32
- name: '1149'
dtype: float32
- name: '1150'
dtype: float32
- name: '1151'
dtype: float32
- name: '1152'
dtype: float32
- name: '1153'
dtype: float32
- name: '1154'
dtype: float32
- name: '1155'
dtype: float32
- name: '1156'
dtype: float32
- name: '1157'
dtype: float32
- name: '1158'
dtype: float32
- name: '1159'
dtype: float32
- name: '1160'
dtype: float32
- name: '1161'
dtype: float32
- name: '1162'
dtype: float32
- name: '1163'
dtype: float32
- name: '1164'
dtype: float32
- name: '1165'
dtype: float32
- name: '1166'
dtype: float32
- name: '1167'
dtype: float32
- name: '1168'
dtype: float32
- name: '1169'
dtype: float32
- name: '1170'
dtype: float32
- name: '1171'
dtype: float32
- name: '1172'
dtype: float32
- name: '1173'
dtype: float32
- name: '1174'
dtype: float32
- name: '1175'
dtype: float32
- name: '1176'
dtype: float32
- name: '1177'
dtype: float32
- name: '1178'
dtype: float32
- name: '1179'
dtype: float32
- name: '1180'
dtype: float32
- name: '1181'
dtype: float32
- name: '1182'
dtype: float32
- name: '1183'
dtype: float32
- name: '1184'
dtype: float32
- name: '1185'
dtype: float32
- name: '1186'
dtype: float32
- name: '1187'
dtype: float32
- name: '1188'
dtype: float32
- name: '1189'
dtype: float32
- name: '1190'
dtype: float32
- name: '1191'
dtype: float32
- name: '1192'
dtype: float32
- name: '1193'
dtype: float32
- name: '1194'
dtype: float32
- name: '1195'
dtype: float32
- name: '1196'
dtype: float32
- name: '1197'
dtype: float32
- name: '1198'
dtype: float32
- name: '1199'
dtype: float32
- name: '1200'
dtype: float32
- name: '1201'
dtype: float32
- name: '1202'
dtype: float32
- name: '1203'
dtype: float32
- name: '1204'
dtype: float32
- name: '1205'
dtype: float32
- name: '1206'
dtype: float32
- name: '1207'
dtype: float32
- name: '1208'
dtype: float32
- name: '1209'
dtype: float32
- name: '1210'
dtype: float32
- name: '1211'
dtype: float32
- name: '1212'
dtype: float32
- name: '1213'
dtype: float32
- name: '1214'
dtype: float32
- name: '1215'
dtype: float32
- name: '1216'
dtype: float32
- name: '1217'
dtype: float32
- name: '1218'
dtype: float32
- name: '1219'
dtype: float32
- name: '1220'
dtype: float32
- name: '1221'
dtype: float32
- name: '1222'
dtype: float32
- name: '1223'
dtype: float32
- name: '1224'
dtype: float32
- name: '1225'
dtype: float32
- name: '1226'
dtype: float32
- name: '1227'
dtype: float32
- name: '1228'
dtype: float32
- name: '1229'
dtype: float32
- name: '1230'
dtype: float32
- name: '1231'
dtype: float32
- name: '1232'
dtype: float32
- name: '1233'
dtype: float32
- name: '1234'
dtype: float32
- name: '1235'
dtype: float32
- name: '1236'
dtype: float32
- name: '1237'
dtype: float32
- name: '1238'
dtype: float32
- name: '1239'
dtype: float32
- name: '1240'
dtype: float32
- name: '1241'
dtype: float32
- name: '1242'
dtype: float32
- name: '1243'
dtype: float32
- name: '1244'
dtype: float32
- name: '1245'
dtype: float32
- name: '1246'
dtype: float32
- name: '1247'
dtype: float32
- name: '1248'
dtype: float32
- name: '1249'
dtype: float32
- name: '1250'
dtype: float32
- name: '1251'
dtype: float32
- name: '1252'
dtype: float32
- name: '1253'
dtype: float32
- name: '1254'
dtype: float32
- name: '1255'
dtype: float32
- name: '1256'
dtype: float32
- name: '1257'
dtype: float32
- name: '1258'
dtype: float32
- name: '1259'
dtype: float32
- name: '1260'
dtype: float32
- name: '1261'
dtype: float32
- name: '1262'
dtype: float32
- name: '1263'
dtype: float32
- name: '1264'
dtype: float32
- name: '1265'
dtype: float32
- name: '1266'
dtype: float32
- name: '1267'
dtype: float32
- name: '1268'
dtype: float32
- name: '1269'
dtype: float32
- name: '1270'
dtype: float32
- name: '1271'
dtype: float32
- name: '1272'
dtype: float32
- name: '1273'
dtype: float32
- name: '1274'
dtype: float32
- name: '1275'
dtype: float32
- name: '1276'
dtype: float32
- name: '1277'
dtype: float32
- name: '1278'
dtype: float32
- name: '1279'
dtype: float32
- name: '1280'
dtype: float32
- name: '1281'
dtype: float32
- name: '1282'
dtype: float32
- name: '1283'
dtype: float32
- name: '1284'
dtype: float32
- name: '1285'
dtype: float32
- name: '1286'
dtype: float32
- name: '1287'
dtype: float32
- name: '1288'
dtype: float32
- name: '1289'
dtype: float32
- name: '1290'
dtype: float32
- name: '1291'
dtype: float32
- name: '1292'
dtype: float32
- name: '1293'
dtype: float32
- name: '1294'
dtype: float32
- name: '1295'
dtype: float32
- name: '1296'
dtype: float32
- name: '1297'
dtype: float32
- name: '1298'
dtype: float32
- name: '1299'
dtype: float32
- name: '1300'
dtype: float32
- name: '1301'
dtype: float32
- name: '1302'
dtype: float32
- name: '1303'
dtype: float32
- name: '1304'
dtype: float32
- name: '1305'
dtype: float32
- name: '1306'
dtype: float32
- name: '1307'
dtype: float32
- name: '1308'
dtype: float32
- name: '1309'
dtype: float32
- name: '1310'
dtype: float32
- name: '1311'
dtype: float32
- name: '1312'
dtype: float32
- name: '1313'
dtype: float32
- name: '1314'
dtype: float32
- name: '1315'
dtype: float32
- name: '1316'
dtype: float32
- name: '1317'
dtype: float32
- name: '1318'
dtype: float32
- name: '1319'
dtype: float32
- name: '1320'
dtype: float32
- name: '1321'
dtype: float32
- name: '1322'
dtype: float32
- name: '1323'
dtype: float32
- name: '1324'
dtype: float32
- name: '1325'
dtype: float32
- name: '1326'
dtype: float32
- name: '1327'
dtype: float32
- name: '1328'
dtype: float32
- name: '1329'
dtype: float32
- name: '1330'
dtype: float32
- name: '1331'
dtype: float32
- name: '1332'
dtype: float32
- name: '1333'
dtype: float32
- name: '1334'
dtype: float32
- name: '1335'
dtype: float32
- name: '1336'
dtype: float32
- name: '1337'
dtype: float32
- name: '1338'
dtype: float32
- name: '1339'
dtype: float32
- name: '1340'
dtype: float32
- name: '1341'
dtype: float32
- name: '1342'
dtype: float32
- name: '1343'
dtype: float32
- name: '1344'
dtype: float32
- name: '1345'
dtype: float32
- name: '1346'
dtype: float32
- name: '1347'
dtype: float32
- name: '1348'
dtype: float32
- name: '1349'
dtype: float32
- name: '1350'
dtype: float32
- name: '1351'
dtype: float32
- name: '1352'
dtype: float32
- name: '1353'
dtype: float32
- name: '1354'
dtype: float32
- name: '1355'
dtype: float32
- name: '1356'
dtype: float32
- name: '1357'
dtype: float32
- name: '1358'
dtype: float32
- name: '1359'
dtype: float32
- name: '1360'
dtype: float32
- name: '1361'
dtype: float32
- name: '1362'
dtype: float32
- name: '1363'
dtype: float32
- name: '1364'
dtype: float32
- name: '1365'
dtype: float32
- name: '1366'
dtype: float32
- name: '1367'
dtype: float32
- name: '1368'
dtype: float32
- name: '1369'
dtype: float32
- name: '1370'
dtype: float32
- name: '1371'
dtype: float32
- name: '1372'
dtype: float32
- name: '1373'
dtype: float32
- name: '1374'
dtype: float32
- name: '1375'
dtype: float32
- name: '1376'
dtype: float32
- name: '1377'
dtype: float32
- name: '1378'
dtype: float32
- name: '1379'
dtype: float32
- name: '1380'
dtype: float32
- name: '1381'
dtype: float32
- name: '1382'
dtype: float32
- name: '1383'
dtype: float32
- name: '1384'
dtype: float32
- name: '1385'
dtype: float32
- name: '1386'
dtype: float32
- name: '1387'
dtype: float32
- name: '1388'
dtype: float32
- name: '1389'
dtype: float32
- name: '1390'
dtype: float32
- name: '1391'
dtype: float32
- name: '1392'
dtype: float32
- name: '1393'
dtype: float32
- name: '1394'
dtype: float32
- name: '1395'
dtype: float32
- name: '1396'
dtype: float32
- name: '1397'
dtype: float32
- name: '1398'
dtype: float32
- name: '1399'
dtype: float32
- name: '1400'
dtype: float32
- name: '1401'
dtype: float32
- name: '1402'
dtype: float32
- name: '1403'
dtype: float32
- name: '1404'
dtype: float32
- name: '1405'
dtype: float32
- name: '1406'
dtype: float32
- name: '1407'
dtype: float32
- name: '1408'
dtype: float32
- name: '1409'
dtype: float32
- name: '1410'
dtype: float32
- name: '1411'
dtype: float32
- name: '1412'
dtype: float32
- name: '1413'
dtype: float32
- name: '1414'
dtype: float32
- name: '1415'
dtype: float32
- name: '1416'
dtype: float32
- name: '1417'
dtype: float32
- name: '1418'
dtype: float32
- name: '1419'
dtype: float32
- name: '1420'
dtype: float32
- name: '1421'
dtype: float32
- name: '1422'
dtype: float32
- name: '1423'
dtype: float32
- name: '1424'
dtype: float32
- name: '1425'
dtype: float32
- name: '1426'
dtype: float32
- name: '1427'
dtype: float32
- name: '1428'
dtype: float32
- name: '1429'
dtype: float32
- name: '1430'
dtype: float32
- name: '1431'
dtype: float32
- name: '1432'
dtype: float32
- name: '1433'
dtype: float32
- name: '1434'
dtype: float32
- name: '1435'
dtype: float32
- name: '1436'
dtype: float32
- name: '1437'
dtype: float32
- name: '1438'
dtype: float32
- name: '1439'
dtype: float32
- name: '1440'
dtype: float32
- name: '1441'
dtype: float32
- name: '1442'
dtype: float32
- name: '1443'
dtype: float32
- name: '1444'
dtype: float32
- name: '1445'
dtype: float32
- name: '1446'
dtype: float32
- name: '1447'
dtype: float32
- name: '1448'
dtype: float32
- name: '1449'
dtype: float32
- name: '1450'
dtype: float32
- name: '1451'
dtype: float32
- name: '1452'
dtype: float32
- name: '1453'
dtype: float32
- name: '1454'
dtype: float32
- name: '1455'
dtype: float32
- name: '1456'
dtype: float32
- name: '1457'
dtype: float32
- name: '1458'
dtype: float32
- name: '1459'
dtype: float32
- name: '1460'
dtype: float32
- name: '1461'
dtype: float32
- name: '1462'
dtype: float32
- name: '1463'
dtype: float32
- name: '1464'
dtype: float32
- name: '1465'
dtype: float32
- name: '1466'
dtype: float32
- name: '1467'
dtype: float32
- name: '1468'
dtype: float32
- name: '1469'
dtype: float32
- name: '1470'
dtype: float32
- name: '1471'
dtype: float32
- name: '1472'
dtype: float32
- name: '1473'
dtype: float32
- name: '1474'
dtype: float32
- name: '1475'
dtype: float32
- name: '1476'
dtype: float32
- name: '1477'
dtype: float32
- name: '1478'
dtype: float32
- name: '1479'
dtype: float32
- name: '1480'
dtype: float32
- name: '1481'
dtype: float32
- name: '1482'
dtype: float32
- name: '1483'
dtype: float32
- name: '1484'
dtype: float32
- name: '1485'
dtype: float32
- name: '1486'
dtype: float32
- name: '1487'
dtype: float32
- name: '1488'
dtype: float32
- name: '1489'
dtype: float32
- name: '1490'
dtype: float32
- name: '1491'
dtype: float32
- name: '1492'
dtype: float32
- name: '1493'
dtype: float32
- name: '1494'
dtype: float32
- name: '1495'
dtype: float32
- name: '1496'
dtype: float32
- name: '1497'
dtype: float32
- name: '1498'
dtype: float32
- name: '1499'
dtype: float32
- name: '1500'
dtype: float32
- name: '1501'
dtype: float32
- name: '1502'
dtype: float32
- name: '1503'
dtype: float32
- name: '1504'
dtype: float32
- name: '1505'
dtype: float32
- name: '1506'
dtype: float32
- name: '1507'
dtype: float32
- name: '1508'
dtype: float32
- name: '1509'
dtype: float32
- name: '1510'
dtype: float32
- name: '1511'
dtype: float32
- name: '1512'
dtype: float32
- name: '1513'
dtype: float32
- name: '1514'
dtype: float32
- name: '1515'
dtype: float32
- name: '1516'
dtype: float32
- name: '1517'
dtype: float32
- name: '1518'
dtype: float32
- name: '1519'
dtype: float32
- name: '1520'
dtype: float32
- name: '1521'
dtype: float32
- name: '1522'
dtype: float32
- name: '1523'
dtype: float32
- name: '1524'
dtype: float32
- name: '1525'
dtype: float32
- name: '1526'
dtype: float32
- name: '1527'
dtype: float32
- name: '1528'
dtype: float32
- name: '1529'
dtype: float32
- name: '1530'
dtype: float32
- name: '1531'
dtype: float32
- name: '1532'
dtype: float32
- name: '1533'
dtype: float32
- name: '1534'
dtype: float32
- name: '1535'
dtype: float32
- name: '1536'
dtype: float32
- name: '1537'
dtype: float32
- name: '1538'
dtype: float32
- name: '1539'
dtype: float32
- name: '1540'
dtype: float32
- name: '1541'
dtype: float32
- name: '1542'
dtype: float32
- name: '1543'
dtype: float32
- name: '1544'
dtype: float32
- name: '1545'
dtype: float32
- name: '1546'
dtype: float32
- name: '1547'
dtype: float32
- name: '1548'
dtype: float32
- name: '1549'
dtype: float32
- name: '1550'
dtype: float32
- name: '1551'
dtype: float32
- name: '1552'
dtype: float32
- name: '1553'
dtype: float32
- name: '1554'
dtype: float32
- name: '1555'
dtype: float32
- name: '1556'
dtype: float32
- name: '1557'
dtype: float32
- name: '1558'
dtype: float32
- name: '1559'
dtype: float32
- name: '1560'
dtype: float32
- name: '1561'
dtype: float32
- name: '1562'
dtype: float32
- name: '1563'
dtype: float32
- name: '1564'
dtype: float32
- name: '1565'
dtype: float32
- name: '1566'
dtype: float32
- name: '1567'
dtype: float32
- name: '1568'
dtype: float32
- name: '1569'
dtype: float32
- name: '1570'
dtype: float32
- name: '1571'
dtype: float32
- name: '1572'
dtype: float32
- name: '1573'
dtype: float32
- name: '1574'
dtype: float32
- name: '1575'
dtype: float32
- name: '1576'
dtype: float32
- name: '1577'
dtype: float32
- name: '1578'
dtype: float32
- name: '1579'
dtype: float32
- name: '1580'
dtype: float32
- name: '1581'
dtype: float32
- name: '1582'
dtype: float32
- name: '1583'
dtype: float32
- name: '1584'
dtype: float32
- name: '1585'
dtype: float32
- name: '1586'
dtype: float32
- name: '1587'
dtype: float32
- name: '1588'
dtype: float32
- name: '1589'
dtype: float32
- name: '1590'
dtype: float32
- name: '1591'
dtype: float32
- name: '1592'
dtype: float32
- name: '1593'
dtype: float32
- name: '1594'
dtype: float32
- name: '1595'
dtype: float32
- name: '1596'
dtype: float32
- name: '1597'
dtype: float32
- name: '1598'
dtype: float32
- name: '1599'
dtype: float32
- name: '1600'
dtype: float32
- name: '1601'
dtype: float32
- name: '1602'
dtype: float32
- name: '1603'
dtype: float32
- name: '1604'
dtype: float32
- name: '1605'
dtype: float32
- name: '1606'
dtype: float32
- name: '1607'
dtype: float32
- name: '1608'
dtype: float32
- name: '1609'
dtype: float32
- name: '1610'
dtype: float32
- name: '1611'
dtype: float32
- name: '1612'
dtype: float32
- name: '1613'
dtype: float32
- name: '1614'
dtype: float32
- name: '1615'
dtype: float32
- name: '1616'
dtype: float32
- name: '1617'
dtype: float32
- name: '1618'
dtype: float32
- name: '1619'
dtype: float32
- name: '1620'
dtype: float32
- name: '1621'
dtype: float32
- name: '1622'
dtype: float32
- name: '1623'
dtype: float32
- name: '1624'
dtype: float32
- name: '1625'
dtype: float32
- name: '1626'
dtype: float32
- name: '1627'
dtype: float32
- name: '1628'
dtype: float32
- name: '1629'
dtype: float32
- name: '1630'
dtype: float32
- name: '1631'
dtype: float32
- name: '1632'
dtype: float32
- name: '1633'
dtype: float32
- name: '1634'
dtype: float32
- name: '1635'
dtype: float32
- name: '1636'
dtype: float32
- name: '1637'
dtype: float32
- name: '1638'
dtype: float32
- name: '1639'
dtype: float32
- name: '1640'
dtype: float32
- name: '1641'
dtype: float32
- name: '1642'
dtype: float32
- name: '1643'
dtype: float32
- name: '1644'
dtype: float32
- name: '1645'
dtype: float32
- name: '1646'
dtype: float32
- name: '1647'
dtype: float32
- name: '1648'
dtype: float32
- name: '1649'
dtype: float32
- name: '1650'
dtype: float32
- name: '1651'
dtype: float32
- name: '1652'
dtype: float32
- name: '1653'
dtype: float32
- name: '1654'
dtype: float32
- name: '1655'
dtype: float32
- name: '1656'
dtype: float32
- name: '1657'
dtype: float32
- name: '1658'
dtype: float32
- name: '1659'
dtype: float32
- name: '1660'
dtype: float32
- name: '1661'
dtype: float32
- name: '1662'
dtype: float32
- name: '1663'
dtype: float32
- name: '1664'
dtype: float32
- name: '1665'
dtype: float32
- name: '1666'
dtype: float32
- name: '1667'
dtype: float32
- name: '1668'
dtype: float32
- name: '1669'
dtype: float32
- name: '1670'
dtype: float32
- name: '1671'
dtype: float32
- name: '1672'
dtype: float32
- name: '1673'
dtype: float32
- name: '1674'
dtype: float32
- name: '1675'
dtype: float32
- name: '1676'
dtype: float32
- name: '1677'
dtype: float32
- name: '1678'
dtype: float32
- name: '1679'
dtype: float32
- name: '1680'
dtype: float32
- name: '1681'
dtype: float32
- name: '1682'
dtype: float32
- name: '1683'
dtype: float32
- name: '1684'
dtype: float32
- name: '1685'
dtype: float32
- name: '1686'
dtype: float32
- name: '1687'
dtype: float32
- name: '1688'
dtype: float32
- name: '1689'
dtype: float32
- name: '1690'
dtype: float32
- name: '1691'
dtype: float32
- name: '1692'
dtype: float32
- name: '1693'
dtype: float32
- name: '1694'
dtype: float32
- name: '1695'
dtype: float32
- name: '1696'
dtype: float32
- name: '1697'
dtype: float32
- name: '1698'
dtype: float32
- name: '1699'
dtype: float32
- name: '1700'
dtype: float32
- name: '1701'
dtype: float32
- name: '1702'
dtype: float32
- name: '1703'
dtype: float32
- name: '1704'
dtype: float32
- name: '1705'
dtype: float32
- name: '1706'
dtype: float32
- name: '1707'
dtype: float32
- name: '1708'
dtype: float32
- name: '1709'
dtype: float32
- name: '1710'
dtype: float32
- name: '1711'
dtype: float32
- name: '1712'
dtype: float32
- name: '1713'
dtype: float32
- name: '1714'
dtype: float32
- name: '1715'
dtype: float32
- name: '1716'
dtype: float32
- name: '1717'
dtype: float32
- name: '1718'
dtype: float32
- name: '1719'
dtype: float32
- name: '1720'
dtype: float32
- name: '1721'
dtype: float32
- name: '1722'
dtype: float32
- name: '1723'
dtype: float32
- name: '1724'
dtype: float32
- name: '1725'
dtype: float32
- name: '1726'
dtype: float32
- name: '1727'
dtype: float32
- name: '1728'
dtype: float32
- name: '1729'
dtype: float32
- name: '1730'
dtype: float32
- name: '1731'
dtype: float32
- name: '1732'
dtype: float32
- name: '1733'
dtype: float32
- name: '1734'
dtype: float32
- name: '1735'
dtype: float32
- name: '1736'
dtype: float32
- name: '1737'
dtype: float32
- name: '1738'
dtype: float32
- name: '1739'
dtype: float32
- name: '1740'
dtype: float32
- name: '1741'
dtype: float32
- name: '1742'
dtype: float32
- name: '1743'
dtype: float32
- name: '1744'
dtype: float32
- name: '1745'
dtype: float32
- name: '1746'
dtype: float32
- name: '1747'
dtype: float32
- name: '1748'
dtype: float32
- name: '1749'
dtype: float32
- name: '1750'
dtype: float32
- name: '1751'
dtype: float32
- name: '1752'
dtype: float32
- name: '1753'
dtype: float32
- name: '1754'
dtype: float32
- name: '1755'
dtype: float32
- name: '1756'
dtype: float32
- name: '1757'
dtype: float32
- name: '1758'
dtype: float32
- name: '1759'
dtype: float32
- name: '1760'
dtype: float32
- name: '1761'
dtype: float32
- name: '1762'
dtype: float32
- name: '1763'
dtype: float32
- name: '1764'
dtype: float32
- name: '1765'
dtype: float32
- name: '1766'
dtype: float32
- name: '1767'
dtype: float32
- name: '1768'
dtype: float32
- name: '1769'
dtype: float32
- name: '1770'
dtype: float32
- name: '1771'
dtype: float32
- name: '1772'
dtype: float32
- name: '1773'
dtype: float32
- name: '1774'
dtype: float32
- name: '1775'
dtype: float32
- name: '1776'
dtype: float32
- name: '1777'
dtype: float32
- name: '1778'
dtype: float32
- name: '1779'
dtype: float32
- name: '1780'
dtype: float32
- name: '1781'
dtype: float32
- name: '1782'
dtype: float32
- name: '1783'
dtype: float32
- name: '1784'
dtype: float32
- name: '1785'
dtype: float32
- name: '1786'
dtype: float32
- name: '1787'
dtype: float32
- name: '1788'
dtype: float32
- name: '1789'
dtype: float32
- name: '1790'
dtype: float32
- name: '1791'
dtype: float32
- name: '1792'
dtype: float32
- name: '1793'
dtype: float32
- name: '1794'
dtype: float32
- name: '1795'
dtype: float32
- name: '1796'
dtype: float32
- name: '1797'
dtype: float32
- name: '1798'
dtype: float32
- name: '1799'
dtype: float32
- name: '1800'
dtype: float32
- name: '1801'
dtype: float32
- name: '1802'
dtype: float32
- name: '1803'
dtype: float32
- name: '1804'
dtype: float32
- name: '1805'
dtype: float32
- name: '1806'
dtype: float32
- name: '1807'
dtype: float32
- name: '1808'
dtype: float32
- name: '1809'
dtype: float32
- name: '1810'
dtype: float32
- name: '1811'
dtype: float32
- name: '1812'
dtype: float32
- name: '1813'
dtype: float32
- name: '1814'
dtype: float32
- name: '1815'
dtype: float32
- name: '1816'
dtype: float32
- name: '1817'
dtype: float32
- name: '1818'
dtype: float32
- name: '1819'
dtype: float32
- name: '1820'
dtype: float32
- name: '1821'
dtype: float32
- name: '1822'
dtype: float32
- name: '1823'
dtype: float32
- name: '1824'
dtype: float32
- name: '1825'
dtype: float32
- name: '1826'
dtype: float32
- name: '1827'
dtype: float32
- name: '1828'
dtype: float32
- name: '1829'
dtype: float32
- name: '1830'
dtype: float32
- name: '1831'
dtype: float32
- name: '1832'
dtype: float32
- name: '1833'
dtype: float32
- name: '1834'
dtype: float32
- name: '1835'
dtype: float32
- name: '1836'
dtype: float32
- name: '1837'
dtype: float32
- name: '1838'
dtype: float32
- name: '1839'
dtype: float32
- name: '1840'
dtype: float32
- name: '1841'
dtype: float32
- name: '1842'
dtype: float32
- name: '1843'
dtype: float32
- name: '1844'
dtype: float32
- name: '1845'
dtype: float32
- name: '1846'
dtype: float32
- name: '1847'
dtype: float32
- name: '1848'
dtype: float32
- name: '1849'
dtype: float32
- name: '1850'
dtype: float32
- name: '1851'
dtype: float32
- name: '1852'
dtype: float32
- name: '1853'
dtype: float32
- name: '1854'
dtype: float32
- name: '1855'
dtype: float32
- name: '1856'
dtype: float32
- name: '1857'
dtype: float32
- name: '1858'
dtype: float32
- name: '1859'
dtype: float32
- name: '1860'
dtype: float32
- name: '1861'
dtype: float32
- name: '1862'
dtype: float32
- name: '1863'
dtype: float32
- name: '1864'
dtype: float32
- name: '1865'
dtype: float32
- name: '1866'
dtype: float32
- name: '1867'
dtype: float32
- name: '1868'
dtype: float32
- name: '1869'
dtype: float32
- name: '1870'
dtype: float32
- name: '1871'
dtype: float32
- name: '1872'
dtype: float32
- name: '1873'
dtype: float32
- name: '1874'
dtype: float32
- name: '1875'
dtype: float32
- name: '1876'
dtype: float32
- name: '1877'
dtype: float32
- name: '1878'
dtype: float32
- name: '1879'
dtype: float32
- name: '1880'
dtype: float32
- name: '1881'
dtype: float32
- name: '1882'
dtype: float32
- name: '1883'
dtype: float32
- name: '1884'
dtype: float32
- name: '1885'
dtype: float32
- name: '1886'
dtype: float32
- name: '1887'
dtype: float32
- name: '1888'
dtype: float32
- name: '1889'
dtype: float32
- name: '1890'
dtype: float32
- name: '1891'
dtype: float32
- name: '1892'
dtype: float32
- name: '1893'
dtype: float32
- name: '1894'
dtype: float32
- name: '1895'
dtype: float32
- name: '1896'
dtype: float32
- name: '1897'
dtype: float32
- name: '1898'
dtype: float32
- name: '1899'
dtype: float32
- name: '1900'
dtype: float32
- name: '1901'
dtype: float32
- name: '1902'
dtype: float32
- name: '1903'
dtype: float32
- name: '1904'
dtype: float32
- name: '1905'
dtype: float32
- name: '1906'
dtype: float32
- name: '1907'
dtype: float32
- name: '1908'
dtype: float32
- name: '1909'
dtype: float32
- name: '1910'
dtype: float32
- name: '1911'
dtype: float32
- name: '1912'
dtype: float32
- name: '1913'
dtype: float32
- name: '1914'
dtype: float32
- name: '1915'
dtype: float32
- name: '1916'
dtype: float32
- name: '1917'
dtype: float32
- name: '1918'
dtype: float32
- name: '1919'
dtype: float32
- name: '1920'
dtype: float32
- name: '1921'
dtype: float32
- name: '1922'
dtype: float32
- name: '1923'
dtype: float32
- name: '1924'
dtype: float32
- name: '1925'
dtype: float32
- name: '1926'
dtype: float32
- name: '1927'
dtype: float32
- name: '1928'
dtype: float32
- name: '1929'
dtype: float32
- name: '1930'
dtype: float32
- name: '1931'
dtype: float32
- name: '1932'
dtype: float32
- name: '1933'
dtype: float32
- name: '1934'
dtype: float32
- name: '1935'
dtype: float32
- name: '1936'
dtype: float32
- name: '1937'
dtype: float32
- name: '1938'
dtype: float32
- name: '1939'
dtype: float32
- name: '1940'
dtype: float32
- name: '1941'
dtype: float32
- name: '1942'
dtype: float32
- name: '1943'
dtype: float32
- name: '1944'
dtype: float32
- name: '1945'
dtype: float32
- name: '1946'
dtype: float32
- name: '1947'
dtype: float32
- name: '1948'
dtype: float32
- name: '1949'
dtype: float32
- name: '1950'
dtype: float32
- name: '1951'
dtype: float32
- name: '1952'
dtype: float32
- name: '1953'
dtype: float32
- name: '1954'
dtype: float32
- name: '1955'
dtype: float32
- name: '1956'
dtype: float32
- name: '1957'
dtype: float32
- name: '1958'
dtype: float32
- name: '1959'
dtype: float32
- name: '1960'
dtype: float32
- name: '1961'
dtype: float32
- name: '1962'
dtype: float32
- name: '1963'
dtype: float32
- name: '1964'
dtype: float32
- name: '1965'
dtype: float32
- name: '1966'
dtype: float32
- name: '1967'
dtype: float32
- name: '1968'
dtype: float32
- name: '1969'
dtype: float32
- name: '1970'
dtype: float32
- name: '1971'
dtype: float32
- name: '1972'
dtype: float32
- name: '1973'
dtype: float32
- name: '1974'
dtype: float32
- name: '1975'
dtype: float32
- name: '1976'
dtype: float32
- name: '1977'
dtype: float32
- name: '1978'
dtype: float32
- name: '1979'
dtype: float32
- name: '1980'
dtype: float32
- name: '1981'
dtype: float32
- name: '1982'
dtype: float32
- name: '1983'
dtype: float32
- name: '1984'
dtype: float32
- name: '1985'
dtype: float32
- name: '1986'
dtype: float32
- name: '1987'
dtype: float32
- name: '1988'
dtype: float32
- name: '1989'
dtype: float32
- name: '1990'
dtype: float32
- name: '1991'
dtype: float32
- name: '1992'
dtype: float32
- name: '1993'
dtype: float32
- name: '1994'
dtype: float32
- name: '1995'
dtype: float32
- name: '1996'
dtype: float32
- name: '1997'
dtype: float32
- name: '1998'
dtype: float32
- name: '1999'
dtype: float32
- name: '2000'
dtype: float32
- name: '2001'
dtype: float32
- name: '2002'
dtype: float32
- name: '2003'
dtype: float32
- name: '2004'
dtype: float32
- name: '2005'
dtype: float32
- name: '2006'
dtype: float32
- name: '2007'
dtype: float32
- name: '2008'
dtype: float32
- name: '2009'
dtype: float32
- name: '2010'
dtype: float32
- name: '2011'
dtype: float32
- name: '2012'
dtype: float32
- name: '2013'
dtype: float32
- name: '2014'
dtype: float32
- name: '2015'
dtype: float32
- name: '2016'
dtype: float32
- name: '2017'
dtype: float32
- name: '2018'
dtype: float32
- name: '2019'
dtype: float32
- name: '2020'
dtype: float32
- name: '2021'
dtype: float32
- name: '2022'
dtype: float32
- name: '2023'
dtype: float32
- name: '2024'
dtype: float32
- name: '2025'
dtype: float32
- name: '2026'
dtype: float32
- name: '2027'
dtype: float32
- name: '2028'
dtype: float32
- name: '2029'
dtype: float32
- name: '2030'
dtype: float32
- name: '2031'
dtype: float32
- name: '2032'
dtype: float32
- name: '2033'
dtype: float32
- name: '2034'
dtype: float32
- name: '2035'
dtype: float32
- name: '2036'
dtype: float32
- name: '2037'
dtype: float32
- name: '2038'
dtype: float32
- name: '2039'
dtype: float32
- name: '2040'
dtype: float32
- name: '2041'
dtype: float32
- name: '2042'
dtype: float32
- name: '2043'
dtype: float32
- name: '2044'
dtype: float32
- name: '2045'
dtype: float32
- name: '2046'
dtype: float32
- name: '2047'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 307582709.0625
num_examples: 37500
- name: test
num_bytes: 102527570.0
num_examples: 12500
download_size: 565394003
dataset_size: 410110279.0625
---
# Dataset Card for "BGL_GPTNEO_Finetuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ovi054/bangla-annotated-data | 2023-08-18T17:14:45.000Z | [
"region:us"
] | ovi054 | null | null | null | 0 | 0 | Entry not found |
miladfa7/DeepFa1 | 2023-08-19T16:36:27.000Z | [
"region:us"
] | miladfa7 | null | null | null | 1 | 0 | Entry not found |
BaekRok/vishing_data | 2023-08-19T01:24:22.000Z | [
"region:us"
] | BaekRok | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: audio
dtype: audio
- name: labels
list:
- name: start
dtype: float64
- name: text
dtype: string
- name: end
dtype: float64
- name: speaker
dtype: string
- name: label
dtype: string
- name: seg_num
dtype: int64
- name: total_seg
dtype: int64
- name: prob
dtype: float64
splits:
- name: train
num_bytes: 48988768453.712
num_examples: 16496
- name: validation
num_bytes: 8026214010.768
num_examples: 2071
- name: test
num_bytes: 8851253927.312
num_examples: 2156
download_size: 16900478025
dataset_size: 65866236391.79199
---
# Dataset Card for "vishing_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yvillamil/stratio-doc | 2023-08-18T17:33:49.000Z | [
"region:us"
] | yvillamil | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 41578
num_examples: 3
download_size: 13650
dataset_size: 41578
---
# Dataset Card for "stratio-doc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qillura/pusheen | 2023-08-18T21:53:50.000Z | [
"task_categories:text-to-image",
"size_categories:1M<n<10M",
"language:en",
"license:cc0-1.0",
"region:us"
] | qillura | null | null | null | 0 | 0 | ---
license: cc0-1.0
task_categories:
- text-to-image
language:
- en
pretty_name: Pusheen
size_categories:
- 1M<n<10M
--- |
SergeyKarpenko1/autotrain-data-nlp | 2023-08-18T17:57:12.000Z | [
"language:en",
"region:us"
] | SergeyKarpenko1 | null | null | null | 0 | 0 | ---
language:
- en
---
# AutoTrain Dataset for project: nlp
## Dataset Description
This dataset has been automatically processed by AutoTrain for project nlp.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"context": "\u0417\u0434\u0440\u0430\u0432\u0441\u0442\u0432\u0443\u0439\u0442\u0435 \ud83e\udd1d \u041d\u0430 \u0441\u043f\u0435\u0446\u0438\u0430\u043b\u044c\u043d\u043e\u043c \u0433\u0435\u043b\u0435, \u0432 \u043a\u043e\u0442\u043e\u0440\u043e\u043c \u043f\u043e\u0434\u043e\u0431\u0440\u0430\u043d\u044b \u0438\u043d\u0433\u0440\u0435\u0434\u0438\u0435\u043d\u0442\u044b \u043e\u0442\u0432\u0435\u0447\u0430\u044e\u0449\u0438\u0435 \u0437\u0430 \u043f\u043e\u0434\u0442\u044f\u0436\u043a\u0443, \u0430\u043d\u0442\u0438\u0446\u0435\u043b\u043b\u044e\u043b\u0438\u0442\u043d\u044b\u0439 \u044d\u0444\u0444\u0435\u043a\u0442 \u0438 \u0436\u0438\u0440\u043e\u0437\u0436\u0438\u0433\u0430\u043d\u0438\u0435 ! \u0412\u044b\u043f\u043e\u043b\u043d\u044f\u0442\u044c\u0441\u044f \u043e\u043f\u0440\u0435\u0434\u0435\u043b\u0451\u043d\u043d\u044b\u0435 \u0442\u0435\u0445\u043d\u0438\u043a\u0438 \u043c\u0430\u0441\u0441\u0430\u0436\u043d\u044b\u0435, \u043f\u043e \u043e\u043f\u0440\u0435\u0434\u0435\u043b\u0451\u043d\u043d\u044b\u043c \u043b\u0438\u043d\u0438\u044f\u043c, \u043f\u0438\u043b\u0438\u043d\u0433 \u0441\u043d\u0430\u0447\u0430\u043b\u0430, \u043f\u043e\u0442\u043e\u043c \u0432\u0431\u0438\u0432\u0430\u043d\u0438\u0435 \u0413\u0435\u043b\u044f \u0432 \u043f\u043e\u0440\u044b \u0438 \u043e\u0431\u0435\u0440\u0442\u044b\u0432\u0430\u043d\u0438\u0435 \u0432 \u0438\u043d\u0444\u0440\u0430\u043a\u0440\u0430\u0441\u043d\u043e\u0435 \u043e\u0434\u0435\u044f\u043b\u043e! \u041d\u0435 \u0431\u043e\u043b\u044c\u043d\u043e)",
"question": "\u0417\u0434\u0440\u0430\u0432\u0441\u0442\u0432\u0443\u0439\u0442\u0435, \u0440\u0430\u0441\u0441\u043a\u0430\u0436\u0438\u0442\u0435 \u043f\u0440\u043e \u043f\u0440\u043e\u0446\u0435\u0434\u0443\u0440\u0443 \u043a\u0430\u0440\u0430\u043c\u0435\u043b\u044c\u043d\u0430\u044f \u043b\u0438\u043f\u0430\u043a\u0441\u0430\u0446\u0438\u044f , \u043a\u0430\u043a \u044d\u0442\u043e \u0434\u0435\u043b\u0430\u0435\u0442\u0441\u044f , \u0431\u043e\u043b\u044c\u043d\u043e \u044d\u0442\u043e ?",
"answers.text": [
"\u0417\u0434\u0440\u0430\u0432\u0441\u0442\u0432\u0443\u0439\u0442\u0435 \ud83e\udd1d \u041d\u0430 \u0441\u043f\u0435\u0446\u0438\u0430\u043b\u044c\u043d\u043e\u043c \u0433\u0435\u043b\u0435, \u0432 \u043a\u043e\u0442\u043e\u0440\u043e\u043c \u043f\u043e\u0434\u043e\u0431\u0440\u0430\u043d\u044b \u0438\u043d\u0433\u0440\u0435\u0434\u0438\u0435\u043d\u0442\u044b \u043e\u0442\u0432\u0435\u0447\u0430\u044e\u0449\u0438\u0435 \u0437\u0430 \u043f\u043e\u0434\u0442\u044f\u0436\u043a\u0443, \u0430\u043d\u0442\u0438\u0446\u0435\u043b\u043b\u044e\u043b\u0438\u0442\u043d\u044b\u0439 \u044d\u0444\u0444\u0435\u043a\u0442 \u0438 \u0436\u0438\u0440\u043e\u0437\u0436\u0438\u0433\u0430\u043d\u0438\u0435 ! \u0412\u044b\u043f\u043e\u043b\u043d\u044f\u0442\u044c\u0441\u044f \u043e\u043f\u0440\u0435\u0434\u0435\u043b\u0451\u043d\u043d\u044b\u0435 \u0442\u0435\u0445\u043d\u0438\u043a\u0438 \u043c\u0430\u0441\u0441\u0430\u0436\u043d\u044b\u0435, \u043f\u043e \u043e\u043f\u0440\u0435\u0434\u0435\u043b\u0451\u043d\u043d\u044b\u043c \u043b\u0438\u043d\u0438\u044f\u043c, \u043f\u0438\u043b\u0438\u043d\u0433 \u0441\u043d\u0430\u0447\u0430\u043b\u0430, \u043f\u043e\u0442\u043e\u043c \u0432\u0431\u0438\u0432\u0430\u043d\u0438\u0435 \u0413\u0435\u043b\u044f \u0432 \u043f\u043e\u0440\u044b \u0438 \u043e\u0431\u0435\u0440\u0442\u044b\u0432\u0430\u043d\u0438\u0435 \u0432 \u0438\u043d\u0444\u0440\u0430\u043a\u0440\u0430\u0441\u043d\u043e\u0435 \u043e\u0434\u0435\u044f\u043b\u043e! \u041d\u0435 \u0431\u043e\u043b\u044c\u043d\u043e)"
],
"answers.answer_start": [
-1
]
},
{
"context": "\u0417\u0434\u0440\u0430\u0432\u0441\u0442\u0432\u0443\u0439\u0442\u0435 \ud83e\udd1d \u0414\u0430, \u043d\u0430 \u0447\u0442\u043e \u0445\u043e\u0442\u0438\u0442\u0435?) \u0441 \u043a\u0430\u043a\u043e\u0433\u043e \u0447\u0438\u0441\u043b\u0430? \u0412 \u043a\u0430\u043a\u043e\u0435 \u0432\u0440\u0435\u043c\u044f \u0443\u0434\u043e\u0431\u043d\u043e?",
"question": "\u0410 \u043c\u043e\u0436\u043d\u043e \u043d\u0430 \u044f\u043d\u0432\u0430\u0440\u044c \u0443\u0436\u0435 \u0437\u0430\u043f\u0438\u0441\u0430\u0442\u044c\u0441\u044f ?",
"answers.text": [
"\u0417\u0434\u0440\u0430\u0432\u0441\u0442\u0432\u0443\u0439\u0442\u0435 \ud83e\udd1d \u0414\u0430, \u043d\u0430 \u0447\u0442\u043e \u0445\u043e\u0442\u0438\u0442\u0435?) \u0441 \u043a\u0430\u043a\u043e\u0433\u043e \u0447\u0438\u0441\u043b\u0430? \u0412 \u043a\u0430\u043a\u043e\u0435 \u0432\u0440\u0435\u043c\u044f \u0443\u0434\u043e\u0431\u043d\u043e?"
],
"answers.answer_start": [
-1
]
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"context": "Value(dtype='string', id=None)",
"question": "Value(dtype='string', id=None)",
"answers.text": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)",
"answers.answer_start": "Sequence(feature=Value(dtype='int32', id=None), length=-1, id=None)"
}
```
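The fields above follow a flattened SQuAD-style schema, where `answers.text` and `answers.answer_start` are parallel sequences. As a minimal illustration (using made-up sample values rather than actual records from this dataset), a single record of this shape can be parsed with the standard-library `json` module:

```python
import json

# A record in the schema described above (illustrative values only;
# the real records carry Russian-language text, JSON-escaped).
sample = '''
{
  "context": "sample context",
  "question": "sample question",
  "answers.text": ["sample context"],
  "answers.answer_start": [-1]
}
'''

record = json.loads(sample)

# An "answers.answer_start" of -1 marks an answer that has no
# character offset into the context (extractive span unavailable).
print(record["question"])                  # sample question
print(record["answers.text"][0])           # sample context
print(record["answers.answer_start"][0])   # -1
```

Note that the dotted field names (`answers.text`) are literal JSON keys here, not nested objects, so they are accessed with a single string key.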
### Dataset Splits
This dataset is split into train and validation splits. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 248 |
| valid | 63 |
|
davanstrien/newspaper-type | 2023-08-18T18:04:41.000Z | [
"region:us"
] | davanstrien | null | null | null | 0 | 0 | ---
configs:
- config_name: cleaned
data_files:
- split: train
path: cleaned/train-*
- config_name: davanstrien--newspaper-type
data_files:
- split: train
path: davanstrien--newspaper-type/train-*
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
- config_name: cleaned
features:
- name: filename
dtype: string
- name: art
dtype: float64
- name: text
dtype: string
- name: issue_name
dtype: string
- name: type
dtype: string
splits:
- name: train
num_bytes: 878196.063583815
num_examples: 143
download_size: 0
dataset_size: 878196.063583815
- config_name: davanstrien--newspaper-type
features:
- name: filename
dtype: string
- name: art
dtype: float64
- name: text
dtype: string
- name: issue_name
dtype: string
- name: type
dtype: string
splits:
- name: train
num_bytes: 1062433.0
num_examples: 173
download_size: 662620
dataset_size: 1062433.0
- config_name: default
features:
- name: filename
dtype: string
- name: art
dtype: float64
- name: text
dtype: string
- name: issue_name
dtype: string
- name: type
dtype: string
splits:
- name: train
num_bytes: 1062433.0
num_examples: 173
download_size: 0
dataset_size: 1062433.0
---
# Dataset Card for "newspaper-type"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EgilKarlsen/Spirit_BERT_Finetuned | 2023-08-23T05:10:06.000Z | [
"region:us"
] | EgilKarlsen | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115650065.625
num_examples: 37500
- name: test
num_bytes: 38550020.0
num_examples: 12500
download_size: 211763544
dataset_size: 154200085.625
---
# Dataset Card for "Spirit_BERT_Finetuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EgilKarlsen/Spirit_RoBERTa_Finetuned | 2023-08-23T05:16:58.000Z | [
"region:us"
] | EgilKarlsen | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115650065.625
num_examples: 37500
- name: test
num_bytes: 38550020.0
num_examples: 12500
download_size: 211788382
dataset_size: 154200085.625
---
# Dataset Card for "Spirit_RoBERTa_Finetuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EgilKarlsen/Spirit_DistilRoBERTa_Finetuned | 2023-08-23T05:23:29.000Z | [
"region:us"
] | EgilKarlsen | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115650065.625
num_examples: 37500
- name: test
num_bytes: 38550020.0
num_examples: 12500
download_size: 211787430
dataset_size: 154200085.625
---
# Dataset Card for "Spirit_DistilRoBERTa_Finetuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EgilKarlsen/Spirit_GPT2_Finetuned | 2023-08-23T05:30:24.000Z | [
"region:us"
] | EgilKarlsen | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115650065.625
num_examples: 37500
- name: test
num_bytes: 38550020.0
num_examples: 12500
download_size: 211753822
dataset_size: 154200085.625
---
# Dataset Card for "Spirit_GPT2_Finetuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wufeim/aug_text | 2023-08-30T05:04:49.000Z | [
"region:us"
] | wufeim | null | null | null | 0 | 0 | ## Video Augmented Texts Data
### VATEX
Each video contains 10 captions. In `vatex.zip`, there are:
* `test/`: a folder containing all available videos
* `vatex_public_test_english_v1.1.json`: JSON file containing all captions
Example data loading:
```py
import os
import json
path = 'vatex_public_test_english_v1.1.json'
d = json.load(open(path, 'r'))
captions = {v['videoID']: v['enCap'] for v in d}
for vname, video_captions in captions.items():
    video_path = os.path.join('test', vname + '.mp4')  # path to the video
    # video_captions is a list of 10 str
```
### MSR-VTT
Each video contains 1 caption. There are two files for MSR-VTT:
* `MSRVTT.zip`: contains all videos
* `MSRVTT_JSFUSION_test.csv`: contains all captions
Example data loading:
```py
import os
import pandas as pd
path = 'MSRVTT_JSFUSION_test.csv'
df = pd.read_csv(path)
vid_id_list = df['video_id'].tolist()
caption_list = df['sentence'].tolist()
for vid_id, caption in zip(vid_id_list, caption_list):
    video_path = os.path.join('MSRVTT', 'videos', 'all', vid_id + '.mp4')
    captions = [caption]  # a list of 1 str
```
|
open-llm-leaderboard/details_Kunhao__pile-7b | 2023-08-27T12:40:00.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Kunhao/pile-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kunhao/pile-7b](https://huggingface.co/Kunhao/pile-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kunhao__pile-7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T14:02:00.215909](https://huggingface.co/datasets/open-llm-leaderboard/details_Kunhao__pile-7b/blob/main/results_2023-08-17T14%3A02%3A00.215909.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26607314141949256,\n\
\ \"acc_stderr\": 0.031950603341667064,\n \"acc_norm\": 0.2676071883857905,\n\
\ \"acc_norm_stderr\": 0.03196207703098002,\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931572,\n \"mc2\": 0.4240744665255174,\n\
\ \"mc2_stderr\": 0.014948776413812296\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2380546075085324,\n \"acc_stderr\": 0.012445770028026203,\n\
\ \"acc_norm\": 0.26791808873720135,\n \"acc_norm_stderr\": 0.01294203019513643\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3269269069906393,\n\
\ \"acc_stderr\": 0.004681316064444439,\n \"acc_norm\": 0.3875721967735511,\n\
\ \"acc_norm_stderr\": 0.004862003566798543\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n\
\ \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n\
\ \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.035834961763610625,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.035834961763610625\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21132075471698114,\n \"acc_stderr\": 0.025125766484827842,\n\
\ \"acc_norm\": 0.21132075471698114,\n \"acc_norm_stderr\": 0.025125766484827842\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.20833333333333334,\n\
\ \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.20833333333333334,\n\
\ \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n\
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.02767845257821239,\n\
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.02767845257821239\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707841,\n\
\ \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707841\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643898,\n \"\
acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643898\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.036196045241242515,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.036196045241242515\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.27741935483870966,\n\
\ \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.27741935483870966,\n\
\ \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.24630541871921183,\n \"acc_stderr\": 0.030315099285617722,\n\
\ \"acc_norm\": 0.24630541871921183,\n \"acc_norm_stderr\": 0.030315099285617722\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.0340150671524904,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.0340150671524904\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35858585858585856,\n \"acc_stderr\": 0.03416903640391521,\n \"\
acc_norm\": 0.35858585858585856,\n \"acc_norm_stderr\": 0.03416903640391521\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n\
\ \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.31025641025641026,\n \"acc_stderr\": 0.023454674889404288,\n\
\ \"acc_norm\": 0.31025641025641026,\n \"acc_norm_stderr\": 0.023454674889404288\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948492,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.026841514322958948,\n\
\ \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.026841514322958948\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.036313298039696525,\n \"\
acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.25871559633027524,\n \"acc_stderr\": 0.018776052319619624,\n \"\
acc_norm\": 0.25871559633027524,\n \"acc_norm_stderr\": 0.018776052319619624\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.27941176470588236,\n \"acc_stderr\": 0.031493281045079556,\n \"\
acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.031493281045079556\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2109704641350211,\n \"acc_stderr\": 0.026558372502661923,\n \
\ \"acc_norm\": 0.2109704641350211,\n \"acc_norm_stderr\": 0.026558372502661923\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.28699551569506726,\n\
\ \"acc_stderr\": 0.030360379710291954,\n \"acc_norm\": 0.28699551569506726,\n\
\ \"acc_norm_stderr\": 0.030360379710291954\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.039849796533028725,\n \"\
acc_norm\": 0.256198347107438,\n \"acc_norm_stderr\": 0.039849796533028725\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.042878587513404565,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.042878587513404565\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.1794871794871795,\n\
\ \"acc_stderr\": 0.02514093595033545,\n \"acc_norm\": 0.1794871794871795,\n\
\ \"acc_norm_stderr\": 0.02514093595033545\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24648786717752236,\n\
\ \"acc_stderr\": 0.015411308769686941,\n \"acc_norm\": 0.24648786717752236,\n\
\ \"acc_norm_stderr\": 0.015411308769686941\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.02353292543104428,\n\
\ \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.02353292543104428\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23687150837988827,\n\
\ \"acc_stderr\": 0.01421957078810398,\n \"acc_norm\": 0.23687150837988827,\n\
\ \"acc_norm_stderr\": 0.01421957078810398\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.02495418432487991,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.02495418432487991\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24115755627009647,\n\
\ \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.24115755627009647,\n\
\ \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.023788583551658544,\n\
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.023788583551658544\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24822695035460993,\n \"acc_stderr\": 0.025770015644290413,\n \
\ \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.025770015644290413\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24902216427640156,\n\
\ \"acc_stderr\": 0.011044892264040772,\n \"acc_norm\": 0.24902216427640156,\n\
\ \"acc_norm_stderr\": 0.011044892264040772\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4227941176470588,\n \"acc_stderr\": 0.03000856284500347,\n\
\ \"acc_norm\": 0.4227941176470588,\n \"acc_norm_stderr\": 0.03000856284500347\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24673202614379086,\n \"acc_stderr\": 0.0174408203674025,\n \
\ \"acc_norm\": 0.24673202614379086,\n \"acc_norm_stderr\": 0.0174408203674025\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.19090909090909092,\n\
\ \"acc_stderr\": 0.03764425585984924,\n \"acc_norm\": 0.19090909090909092,\n\
\ \"acc_norm_stderr\": 0.03764425585984924\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3877551020408163,\n \"acc_stderr\": 0.031192230726795656,\n\
\ \"acc_norm\": 0.3877551020408163,\n \"acc_norm_stderr\": 0.031192230726795656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.25870646766169153,\n\
\ \"acc_stderr\": 0.030965903123573026,\n \"acc_norm\": 0.25870646766169153,\n\
\ \"acc_norm_stderr\": 0.030965903123573026\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2891566265060241,\n\
\ \"acc_stderr\": 0.03529486801511115,\n \"acc_norm\": 0.2891566265060241,\n\
\ \"acc_norm_stderr\": 0.03529486801511115\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2573099415204678,\n \"acc_stderr\": 0.03352799844161865,\n\
\ \"acc_norm\": 0.2573099415204678,\n \"acc_norm_stderr\": 0.03352799844161865\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931572,\n \"mc2\": 0.4240744665255174,\n\
\ \"mc2_stderr\": 0.014948776413812296\n }\n}\n```"
repo_url: https://huggingface.co/Kunhao/pile-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|arc:challenge|25_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hellaswag|10_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:02:00.215909.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:02:00.215909.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T14:02:00.215909.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T14:02:00.215909.parquet'
- config_name: results
data_files:
- split: 2023_08_17T14_02_00.215909
path:
- results_2023-08-17T14:02:00.215909.parquet
- split: latest
path:
- results_2023-08-17T14:02:00.215909.parquet
---
# Dataset Card for Evaluation run of Kunhao/pile-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Kunhao/pile-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Kunhao/pile-7b](https://huggingface.co/Kunhao/pile-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kunhao__pile-7b",
"harness_truthfulqa_mc_0",
split="train")
```
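The split names inside each configuration are run timestamps plus a `latest` alias. If you need the newest timestamped split explicitly (for example to compare successive runs), a small helper can pick it — `latest_split` below is a hypothetical name for illustration, not part of the `datasets` API:

```python
# Hypothetical helper: given the split names of one configuration,
# return the most recent timestamped split (what the "latest" alias points to).
def latest_split(split_names):
    # Timestamp splits look like "2023_08_17T14_02_00.215909"; for this
    # fixed-width format, lexicographic order matches chronological order.
    stamped = [s for s in split_names if s != "latest"]
    return max(stamped)

print(latest_split(["2023_08_17T14_02_00.215909",
                    "2023_09_17T06_31_55.680940",
                    "latest"]))
# → 2023_09_17T06_31_55.680940
```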
## Latest results
These are the [latest results from run 2023-08-17T14:02:00.215909](https://huggingface.co/datasets/open-llm-leaderboard/details_Kunhao__pile-7b/blob/main/results_2023-08-17T14%3A02%3A00.215909.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26607314141949256,
"acc_stderr": 0.031950603341667064,
"acc_norm": 0.2676071883857905,
"acc_norm_stderr": 0.03196207703098002,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931572,
"mc2": 0.4240744665255174,
"mc2_stderr": 0.014948776413812296
},
"harness|arc:challenge|25": {
"acc": 0.2380546075085324,
"acc_stderr": 0.012445770028026203,
"acc_norm": 0.26791808873720135,
"acc_norm_stderr": 0.01294203019513643
},
"harness|hellaswag|10": {
"acc": 0.3269269069906393,
"acc_stderr": 0.004681316064444439,
"acc_norm": 0.3875721967735511,
"acc_norm_stderr": 0.004862003566798543
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.035834961763610625,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.035834961763610625
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21132075471698114,
"acc_stderr": 0.025125766484827842,
"acc_norm": 0.21132075471698114,
"acc_norm_stderr": 0.025125766484827842
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.02767845257821239,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.02767845257821239
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707841,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707841
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643898,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643898
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.036196045241242515,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.036196045241242515
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.27741935483870966,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.27741935483870966,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.24630541871921183,
"acc_stderr": 0.030315099285617722,
"acc_norm": 0.24630541871921183,
"acc_norm_stderr": 0.030315099285617722
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.0340150671524904,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.0340150671524904
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35858585858585856,
"acc_stderr": 0.03416903640391521,
"acc_norm": 0.35858585858585856,
"acc_norm_stderr": 0.03416903640391521
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.31025641025641026,
"acc_stderr": 0.023454674889404288,
"acc_norm": 0.31025641025641026,
"acc_norm_stderr": 0.023454674889404288
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948492,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.026841514322958948,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.026841514322958948
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.036313298039696525,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.036313298039696525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.25871559633027524,
"acc_stderr": 0.018776052319619624,
"acc_norm": 0.25871559633027524,
"acc_norm_stderr": 0.018776052319619624
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.031493281045079556,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.031493281045079556
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2109704641350211,
"acc_stderr": 0.026558372502661923,
"acc_norm": 0.2109704641350211,
"acc_norm_stderr": 0.026558372502661923
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.28699551569506726,
"acc_stderr": 0.030360379710291954,
"acc_norm": 0.28699551569506726,
"acc_norm_stderr": 0.030360379710291954
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.039849796533028725,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.039849796533028725
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404565,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404565
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.1794871794871795,
"acc_stderr": 0.02514093595033545,
"acc_norm": 0.1794871794871795,
"acc_norm_stderr": 0.02514093595033545
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24648786717752236,
"acc_stderr": 0.015411308769686941,
"acc_norm": 0.24648786717752236,
"acc_norm_stderr": 0.015411308769686941
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.02353292543104428,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.02353292543104428
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23687150837988827,
"acc_stderr": 0.01421957078810398,
"acc_norm": 0.23687150837988827,
"acc_norm_stderr": 0.01421957078810398
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24115755627009647,
"acc_stderr": 0.024296594034763426,
"acc_norm": 0.24115755627009647,
"acc_norm_stderr": 0.024296594034763426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.023788583551658544,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.023788583551658544
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.025770015644290413,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.025770015644290413
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24902216427640156,
"acc_stderr": 0.011044892264040772,
"acc_norm": 0.24902216427640156,
"acc_norm_stderr": 0.011044892264040772
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4227941176470588,
"acc_stderr": 0.03000856284500347,
"acc_norm": 0.4227941176470588,
"acc_norm_stderr": 0.03000856284500347
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24673202614379086,
"acc_stderr": 0.0174408203674025,
"acc_norm": 0.24673202614379086,
"acc_norm_stderr": 0.0174408203674025
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.19090909090909092,
"acc_stderr": 0.03764425585984924,
"acc_norm": 0.19090909090909092,
"acc_norm_stderr": 0.03764425585984924
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3877551020408163,
"acc_stderr": 0.031192230726795656,
"acc_norm": 0.3877551020408163,
"acc_norm_stderr": 0.031192230726795656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.25870646766169153,
"acc_stderr": 0.030965903123573026,
"acc_norm": 0.25870646766169153,
"acc_norm_stderr": 0.030965903123573026
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2891566265060241,
"acc_stderr": 0.03529486801511115,
"acc_norm": 0.2891566265060241,
"acc_norm_stderr": 0.03529486801511115
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2573099415204678,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.2573099415204678,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931572,
"mc2": 0.4240744665255174,
"mc2_stderr": 0.014948776413812296
}
}
```
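The leaderboard's MMLU number is the mean of the per-task `hendrycksTest` accuracies in a results dict like the one above. The sketch below uses a two-task stand-in for the full dict (the real one has 57 `hendrycksTest` entries), filtering on the task-name prefix:

```python
# Stand-in for the full results dict above (assumed shape: task -> metrics).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.24},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.23703703703703705},
}

# Collect the MMLU tasks and average their accuracies.
mmlu_accs = [m["acc"] for task, m in results.items() if "hendrycksTest" in task]
mean_acc = sum(mmlu_accs) / len(mmlu_accs)
print(round(mean_acc, 4))
# → 0.2385
```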
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Kunhao__pile-7b-250b-tokens | 2023-09-17T06:32:07.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Kunhao/pile-7b-250b-tokens
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kunhao/pile-7b-250b-tokens](https://huggingface.co/Kunhao/pile-7b-250b-tokens)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kunhao__pile-7b-250b-tokens\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T06:31:55.680940](https://huggingface.co/datasets/open-llm-leaderboard/details_Kunhao__pile-7b-250b-tokens/blob/main/results_2023-09-17T06-31-55.680940.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n\
\ \"em_stderr\": 0.00041913301788268413,\n \"f1\": 0.02978817114093967,\n\
\ \"f1_stderr\": 0.0010045845151481873,\n \"acc\": 0.26666299658982046,\n\
\ \"acc_stderr\": 0.008015854967176925\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.00041913301788268413,\n\
\ \"f1\": 0.02978817114093967,\n \"f1_stderr\": 0.0010045845151481873\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \
\ \"acc_stderr\": 0.002001305720948061\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5280189423835833,\n \"acc_stderr\": 0.014030404213405788\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Kunhao/pile-7b-250b-tokens
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|arc:challenge|25_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T06_31_55.680940
path:
- '**/details_harness|drop|3_2023-09-17T06-31-55.680940.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T06-31-55.680940.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T06_31_55.680940
path:
- '**/details_harness|gsm8k|5_2023-09-17T06-31-55.680940.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T06-31-55.680940.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hellaswag|10_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:43:31.029227.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T19:43:31.029227.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T19:43:31.029227.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T06_31_55.680940
path:
- '**/details_harness|winogrande|5_2023-09-17T06-31-55.680940.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T06-31-55.680940.parquet'
- config_name: results
data_files:
- split: 2023_08_17T19_43_31.029227
path:
- results_2023-08-17T19:43:31.029227.parquet
- split: 2023_09_17T06_31_55.680940
path:
- results_2023-09-17T06-31-55.680940.parquet
- split: latest
path:
- results_2023-09-17T06-31-55.680940.parquet
---
# Dataset Card for Evaluation run of Kunhao/pile-7b-250b-tokens
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Kunhao/pile-7b-250b-tokens
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Kunhao/pile-7b-250b-tokens](https://huggingface.co/Kunhao/pile-7b-250b-tokens) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kunhao__pile-7b-250b-tokens",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T06:31:55.680940](https://huggingface.co/datasets/open-llm-leaderboard/details_Kunhao__pile-7b-250b-tokens/blob/main/results_2023-09-17T06-31-55.680940.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788268413,
"f1": 0.02978817114093967,
"f1_stderr": 0.0010045845151481873,
"acc": 0.26666299658982046,
"acc_stderr": 0.008015854967176925
},
"harness|drop|3": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788268413,
"f1": 0.02978817114093967,
"f1_stderr": 0.0010045845151481873
},
"harness|gsm8k|5": {
"acc": 0.00530705079605762,
"acc_stderr": 0.002001305720948061
},
"harness|winogrande|5": {
"acc": 0.5280189423835833,
"acc_stderr": 0.014030404213405788
}
}
```
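Once loaded, the nested per-task entries above can be flattened for tabulation or sorting. A minimal sketch, using a subset of the aggregated values shown in this card (the `latest` dict below is copied from the results above, not fetched from the Hub):

```python
# Aggregated metrics copied from the latest results shown above
latest = {
    "all": {"em": 0.0016778523489932886, "f1": 0.02978817114093967,
            "acc": 0.26666299658982046},
    "harness|gsm8k|5": {"acc": 0.00530705079605762},
    "harness|winogrande|5": {"acc": 0.5280189423835833},
}

# Flatten into (task, metric, value) rows for easy tabulation
rows = sorted(
    (task, metric, value)
    for task, metrics in latest.items()
    for metric, value in metrics.items()
)
```

The same pattern applies to the full `results` configuration once it has been loaded with `load_dataset` as shown earlier.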
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Corianas__1.3b | 2023-08-27T12:40:03.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Corianas/1.3b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Corianas/1.3b](https://huggingface.co/Corianas/1.3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Corianas__1.3b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T07:03:11.668296](https://huggingface.co/datasets/open-llm-leaderboard/details_Corianas__1.3b/blob/main/results_2023-08-18T07%3A03%3A11.668296.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2683147756647564,\n\
\ \"acc_stderr\": 0.03196561920514779,\n \"acc_norm\": 0.2696986060829294,\n\
\ \"acc_norm_stderr\": 0.03197641826480241,\n \"mc1\": 0.23255813953488372,\n\
\ \"mc1_stderr\": 0.014789157531080507,\n \"mc2\": 0.39015532406426673,\n\
\ \"mc2_stderr\": 0.014644227143696153\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2431740614334471,\n \"acc_stderr\": 0.012536554144587092,\n\
\ \"acc_norm\": 0.27303754266211605,\n \"acc_norm_stderr\": 0.013019332762635743\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.33120892252539336,\n\
\ \"acc_stderr\": 0.004696861625496929,\n \"acc_norm\": 0.3829914359689305,\n\
\ \"acc_norm_stderr\": 0.0048512275270709\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.13,\n \"acc_stderr\": 0.03379976689896308,\n \
\ \"acc_norm\": 0.13,\n \"acc_norm_stderr\": 0.03379976689896308\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.03633384414073461,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.03633384414073461\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19078947368421054,\n \"acc_stderr\": 0.031975658210325,\n\
\ \"acc_norm\": 0.19078947368421054,\n \"acc_norm_stderr\": 0.031975658210325\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2830188679245283,\n \"acc_stderr\": 0.0277242364927009,\n\
\ \"acc_norm\": 0.2830188679245283,\n \"acc_norm_stderr\": 0.0277242364927009\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n\
\ \"acc_stderr\": 0.037161774375660164,\n \"acc_norm\": 0.2708333333333333,\n\
\ \"acc_norm_stderr\": 0.037161774375660164\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.32,\n\
\ \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.19,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.19,\n\
\ \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2170212765957447,\n \"acc_stderr\": 0.026947483121496224,\n\
\ \"acc_norm\": 0.2170212765957447,\n \"acc_norm_stderr\": 0.026947483121496224\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.03724563619774632,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.03724563619774632\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2804232804232804,\n \"acc_stderr\": 0.02313528797432563,\n \"\
acc_norm\": 0.2804232804232804,\n \"acc_norm_stderr\": 0.02313528797432563\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.04073524322147124,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.04073524322147124\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.022755204959542936,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.022755204959542936\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.31527093596059114,\n \"acc_stderr\": 0.03269080871970186,\n\
\ \"acc_norm\": 0.31527093596059114,\n \"acc_norm_stderr\": 0.03269080871970186\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3151515151515151,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.3151515151515151,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.03358618145732523,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03358618145732523\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.32124352331606215,\n \"acc_stderr\": 0.033699508685490674,\n\
\ \"acc_norm\": 0.32124352331606215,\n \"acc_norm_stderr\": 0.033699508685490674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3641025641025641,\n \"acc_stderr\": 0.02439667298509477,\n \
\ \"acc_norm\": 0.3641025641025641,\n \"acc_norm_stderr\": 0.02439667298509477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230186,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230186\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.026841514322958945,\n\
\ \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.026841514322958945\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.036313298039696545,\n \"\
acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696545\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3596330275229358,\n \"acc_stderr\": 0.020575234660123783,\n \"\
acc_norm\": 0.3596330275229358,\n \"acc_norm_stderr\": 0.020575234660123783\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.27848101265822783,\n \"acc_stderr\": 0.02917868230484256,\n\
\ \"acc_norm\": 0.27848101265822783,\n \"acc_norm_stderr\": 0.02917868230484256\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.15695067264573992,\n\
\ \"acc_stderr\": 0.024413587174907412,\n \"acc_norm\": 0.15695067264573992,\n\
\ \"acc_norm_stderr\": 0.024413587174907412\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3067484662576687,\n \"acc_stderr\": 0.036230899157241474,\n\
\ \"acc_norm\": 0.3067484662576687,\n \"acc_norm_stderr\": 0.036230899157241474\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n\
\ \"acc_stderr\": 0.039523019677025116,\n \"acc_norm\": 0.22321428571428573,\n\
\ \"acc_norm_stderr\": 0.039523019677025116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.30097087378640774,\n \"acc_stderr\": 0.04541609446503948,\n\
\ \"acc_norm\": 0.30097087378640774,\n \"acc_norm_stderr\": 0.04541609446503948\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3034188034188034,\n\
\ \"acc_stderr\": 0.030118210106942666,\n \"acc_norm\": 0.3034188034188034,\n\
\ \"acc_norm_stderr\": 0.030118210106942666\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.210727969348659,\n\
\ \"acc_stderr\": 0.014583812465862557,\n \"acc_norm\": 0.210727969348659,\n\
\ \"acc_norm_stderr\": 0.014583812465862557\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.26878612716763006,\n \"acc_stderr\": 0.02386800326250012,\n\
\ \"acc_norm\": 0.26878612716763006,\n \"acc_norm_stderr\": 0.02386800326250012\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23575418994413408,\n\
\ \"acc_stderr\": 0.014196375686290804,\n \"acc_norm\": 0.23575418994413408,\n\
\ \"acc_norm_stderr\": 0.014196375686290804\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.024739981355113592,\n\
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.024739981355113592\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2572347266881029,\n\
\ \"acc_stderr\": 0.024826171289250888,\n \"acc_norm\": 0.2572347266881029,\n\
\ \"acc_norm_stderr\": 0.024826171289250888\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2191358024691358,\n \"acc_stderr\": 0.023016705640262196,\n\
\ \"acc_norm\": 0.2191358024691358,\n \"acc_norm_stderr\": 0.023016705640262196\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180848,\n \
\ \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180848\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2405475880052151,\n\
\ \"acc_stderr\": 0.010916406735478949,\n \"acc_norm\": 0.2405475880052151,\n\
\ \"acc_norm_stderr\": 0.010916406735478949\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.029520095697687758,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.029520095697687758\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.23529411764705882,\n \"acc_stderr\": 0.01716058723504635,\n \
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.01716058723504635\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.18181818181818182,\n\
\ \"acc_stderr\": 0.036942843353377997,\n \"acc_norm\": 0.18181818181818182,\n\
\ \"acc_norm_stderr\": 0.036942843353377997\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3142857142857143,\n \"acc_stderr\": 0.02971932942241747,\n\
\ \"acc_norm\": 0.3142857142857143,\n \"acc_norm_stderr\": 0.02971932942241747\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2885572139303483,\n\
\ \"acc_stderr\": 0.03203841040213322,\n \"acc_norm\": 0.2885572139303483,\n\
\ \"acc_norm_stderr\": 0.03203841040213322\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.29518072289156627,\n\
\ \"acc_stderr\": 0.035509201856896294,\n \"acc_norm\": 0.29518072289156627,\n\
\ \"acc_norm_stderr\": 0.035509201856896294\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.03467826685703826,\n\
\ \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.03467826685703826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23255813953488372,\n\
\ \"mc1_stderr\": 0.014789157531080507,\n \"mc2\": 0.39015532406426673,\n\
\ \"mc2_stderr\": 0.014644227143696153\n }\n}\n```"
repo_url: https://huggingface.co/Corianas/1.3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|arc:challenge|25_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hellaswag|10_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T07:03:11.668296.parquet'
- config_name: results
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- results_2023-08-18T07:03:11.668296.parquet
- split: latest
path:
- results_2023-08-18T07:03:11.668296.parquet
---
# Dataset Card for Evaluation run of Corianas/1.3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Corianas/1.3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Corianas/1.3b](https://huggingface.co/Corianas/1.3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Corianas__1.3b",
"harness_truthfulqa_mc_0",
split="train")
```
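The per-task config names follow a regular pattern visible in the YAML header above. As a small illustration (the helper below is hypothetical, inferred from that naming pattern rather than taken from any official API), a config name for an MMLU subtask can be built like this:

```python
# Hypothetical helper: builds a config name matching the pattern seen in
# this card's YAML header (e.g. "harness_hendrycksTest_world_religions_5").
def config_name(task: str, n_shot: int) -> str:
    return f"harness_hendrycksTest_{task}_{n_shot}"

print(config_name("world_religions", 5))
```

The resulting string can be passed as the second argument to `load_dataset`, as in the snippet above.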
## Latest results
These are the [latest results from run 2023-08-18T07:03:11.668296](https://huggingface.co/datasets/open-llm-leaderboard/details_Corianas__1.3b/blob/main/results_2023-08-18T07%3A03%3A11.668296.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2683147756647564,
"acc_stderr": 0.03196561920514779,
"acc_norm": 0.2696986060829294,
"acc_norm_stderr": 0.03197641826480241,
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080507,
"mc2": 0.39015532406426673,
"mc2_stderr": 0.014644227143696153
},
"harness|arc:challenge|25": {
"acc": 0.2431740614334471,
"acc_stderr": 0.012536554144587092,
"acc_norm": 0.27303754266211605,
"acc_norm_stderr": 0.013019332762635743
},
"harness|hellaswag|10": {
"acc": 0.33120892252539336,
"acc_stderr": 0.004696861625496929,
"acc_norm": 0.3829914359689305,
"acc_norm_stderr": 0.0048512275270709
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.13,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.13,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073461,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073461
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19078947368421054,
"acc_stderr": 0.031975658210325,
"acc_norm": 0.19078947368421054,
"acc_norm_stderr": 0.031975658210325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2830188679245283,
"acc_stderr": 0.0277242364927009,
"acc_norm": 0.2830188679245283,
"acc_norm_stderr": 0.0277242364927009
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.037161774375660164,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.037161774375660164
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2170212765957447,
"acc_stderr": 0.026947483121496224,
"acc_norm": 0.2170212765957447,
"acc_norm_stderr": 0.026947483121496224
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.03724563619774632,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.03724563619774632
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2804232804232804,
"acc_stderr": 0.02313528797432563,
"acc_norm": 0.2804232804232804,
"acc_norm_stderr": 0.02313528797432563
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147124,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147124
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2,
"acc_stderr": 0.022755204959542936,
"acc_norm": 0.2,
"acc_norm_stderr": 0.022755204959542936
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.31527093596059114,
"acc_stderr": 0.03269080871970186,
"acc_norm": 0.31527093596059114,
"acc_norm_stderr": 0.03269080871970186
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3151515151515151,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.3151515151515151,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03358618145732523,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03358618145732523
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.32124352331606215,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.32124352331606215,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3641025641025641,
"acc_stderr": 0.02439667298509477,
"acc_norm": 0.3641025641025641,
"acc_norm_stderr": 0.02439667298509477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230186,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230186
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.026841514322958945,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.026841514322958945
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.036313298039696545,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.036313298039696545
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3596330275229358,
"acc_stderr": 0.020575234660123783,
"acc_norm": 0.3596330275229358,
"acc_norm_stderr": 0.020575234660123783
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.27848101265822783,
"acc_stderr": 0.02917868230484256,
"acc_norm": 0.27848101265822783,
"acc_norm_stderr": 0.02917868230484256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.15695067264573992,
"acc_stderr": 0.024413587174907412,
"acc_norm": 0.15695067264573992,
"acc_norm_stderr": 0.024413587174907412
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3067484662576687,
"acc_stderr": 0.036230899157241474,
"acc_norm": 0.3067484662576687,
"acc_norm_stderr": 0.036230899157241474
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.039523019677025116,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.039523019677025116
},
"harness|hendrycksTest-management|5": {
"acc": 0.30097087378640774,
"acc_stderr": 0.04541609446503948,
"acc_norm": 0.30097087378640774,
"acc_norm_stderr": 0.04541609446503948
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3034188034188034,
"acc_stderr": 0.030118210106942666,
"acc_norm": 0.3034188034188034,
"acc_norm_stderr": 0.030118210106942666
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.210727969348659,
"acc_stderr": 0.014583812465862557,
"acc_norm": 0.210727969348659,
"acc_norm_stderr": 0.014583812465862557
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.26878612716763006,
"acc_stderr": 0.02386800326250012,
"acc_norm": 0.26878612716763006,
"acc_norm_stderr": 0.02386800326250012
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23575418994413408,
"acc_stderr": 0.014196375686290804,
"acc_norm": 0.23575418994413408,
"acc_norm_stderr": 0.014196375686290804
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.024739981355113592,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.024739981355113592
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2572347266881029,
"acc_stderr": 0.024826171289250888,
"acc_norm": 0.2572347266881029,
"acc_norm_stderr": 0.024826171289250888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2191358024691358,
"acc_stderr": 0.023016705640262196,
"acc_norm": 0.2191358024691358,
"acc_norm_stderr": 0.023016705640262196
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180848,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2405475880052151,
"acc_stderr": 0.010916406735478949,
"acc_norm": 0.2405475880052151,
"acc_norm_stderr": 0.010916406735478949
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.029520095697687758,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.029520095697687758
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.01716058723504635,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.01716058723504635
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.18181818181818182,
"acc_stderr": 0.036942843353377997,
"acc_norm": 0.18181818181818182,
"acc_norm_stderr": 0.036942843353377997
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3142857142857143,
"acc_stderr": 0.02971932942241747,
"acc_norm": 0.3142857142857143,
"acc_norm_stderr": 0.02971932942241747
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2885572139303483,
"acc_stderr": 0.03203841040213322,
"acc_norm": 0.2885572139303483,
"acc_norm_stderr": 0.03203841040213322
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-virology|5": {
"acc": 0.29518072289156627,
"acc_stderr": 0.035509201856896294,
"acc_norm": 0.29518072289156627,
"acc_norm_stderr": 0.035509201856896294
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080507,
"mc2": 0.39015532406426673,
"mc2_stderr": 0.014644227143696153
}
}
```
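The `"all"` entry above aggregates the per-task scores. As a rough sketch of how such a macro-average can be computed from this JSON structure (using a three-task excerpt of the results above; this is an illustration, not the leaderboard's exact aggregation code):

```python
# Excerpt of the per-task results dict shown above (three MMLU subtasks).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.13},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.22962962962962963},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.19078947368421054},
}

# Collect the accuracies of all MMLU ("hendrycksTest") subtasks and average them.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
macro_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(macro_avg, 4))
```

Applying the same averaging over the full set of tasks yields values in the neighborhood of the `"all"` figures reported above.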
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |